Shannon's Information Theory and Its Applications

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (22 December 2022) | Viewed by 8980

Special Issue Editors


Dr. Melvin M. Vopson
Guest Editor
School of Mathematics & Physics, University of Portsmouth, Burnaby Road, Burnaby Building, Office 2.13, Portsmouth PO1 3QL, UK
Interests: condensed matter physics research with emphasis on the fundamental properties and applications of ferroic and multiferroic materials; the development of novel measurement techniques; advanced data storage technologies and theoretical studies of non-equilibrium phenomena; fundamental physics and information physics

Dr. Jaewoo Joo
Guest Editor
School of Mathematics & Physics, University of Portsmouth, Portsmouth PO1 3QL, UK
Interests: applying quantum information theory to practical quantum technologies in quantum optics and superconducting circuits

Special Issue Information

Dear Colleagues,

Classical information theory, developed by Claude Shannon in his seminal 1948 work, has facilitated technological breakthroughs across a diverse range of fields, including computing, cryptography, telecommunications, physiology, linguistics, biology, geology, biochemical signalling, mathematics and physics. In particular, the concept of information entropy, derived from Shannon's original work, continues to influence all branches of science today.
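For readers new to the field, the quantity at the centre of this Special Issue, Shannon's information entropy H = -Σ p_i log2(p_i), can be illustrated with a minimal sketch (in Python; the coin-toss data are purely illustrative):

```python
# A minimal illustration of Shannon's information entropy,
# H = -sum_i p_i * log2(p_i), for a discrete probability distribution.
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Entropy (in bits) of the empirical distribution of a symbol sequence."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin carries 1 bit per toss; a biased source carries less.
print(shannon_entropy("HTHTHTHT"))   # 1.0
print(shannon_entropy("HHHHHHHT"))   # ~0.54
```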

In this Special Issue, we aim to address the latest ideas, developments and advances involving any aspect of Shannon's Information Theory and its wider application across all sciences.

We welcome research letters, original papers and review articles as contributions to this Special Issue of Applied Sciences.

Dr. Melvin M. Vopson
Dr. Jaewoo Joo
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (4 papers)


Research

30 pages, 4653 KiB  
Article
Classification of the Central Effects of Transcutaneous Electroacupuncture Stimulation (TEAS) at Different Frequencies: A Deep Learning Approach Using Wavelet Packet Decomposition with an Entropy Estimator
by Çağlar Uyulan, David Mayor, Tony Steffert, Tim Watson and Duncan Banks
Appl. Sci. 2023, 13(4), 2703; https://doi.org/10.3390/app13042703 - 20 Feb 2023
Cited by 2 | Viewed by 2507
Abstract
The field of signal processing using machine and deep learning algorithms has undergone significant growth in the last few years, with a wide scope of practical applications for electroencephalography (EEG). Transcutaneous electroacupuncture stimulation (TEAS) is a well-established variant of the traditional method of acupuncture that is also receiving increasing research attention. This paper presents the results of using deep learning algorithms on EEG data to investigate the effects on the brain of different frequencies of TEAS when applied to the hands in 66 participants, before, during and immediately after 20 min of stimulation. Wavelet packet decomposition (WPD) and a hybrid Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) model were used to examine the central effects of this peripheral stimulation. The classification results were analysed using confusion matrices, with kappa as a metric. Contrary to expectation, the greatest differences in EEG from baseline occurred during TEAS at 80 pulses per second (pps) or in the ‘sham’ (160 pps, zero amplitude), while the smallest differences occurred during 2.5 or 10 pps stimulation (mean kappa 0.414). The mean and coefficient of variation (CV) for kappa were considerably higher for the CNN-LSTM than for the Multilayer Perceptron Neural Network (MLP-NN) model. As far as we are aware from the published literature, no prior artificial intelligence (AI) research has been conducted into the effects on EEG of different frequencies of electroacupuncture-type stimulation (whether EA or TEAS). This ground-breaking study thus offers a significant contribution to the literature. However, as with all (unsupervised) DL methods, a particular challenge is that the results are not easy to interpret, due to the complexity of the algorithms and the lack of a clear understanding of the underlying mechanisms. There is therefore scope for further research that explores the effects of the frequency of TEAS on EEG using AI methods, with the most obvious place to start being a hybrid CNN-LSTM model. This would allow for better extraction of information to understand the central effects of peripheral stimulation.
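For illustration, a minimal sketch of the kind of feature extraction described in the abstract, wavelet packet decomposition followed by a Shannon-entropy estimate per sub-band, is given below (in Python, using the PyWavelets package; the wavelet, decomposition level and synthetic signal are illustrative assumptions, not the settings used in the paper):

```python
# Hedged sketch: wavelet packet decomposition of an EEG-like signal followed by
# a Shannon-entropy feature per frequency sub-band. The wavelet ('db4'), the
# decomposition level and the normalisation are illustrative choices only.
import numpy as np
import pywt

def wpd_entropy_features(signal, wavelet="db4", level=4):
    """Return one Shannon-entropy value per terminal WPD node (sub-band)."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    features = []
    for node in wp.get_level(level, order="freq"):
        energy = node.data ** 2
        p = energy / energy.sum()             # normalise sub-band energies
        p = p[p > 0]                          # avoid log(0)
        features.append(-np.sum(p * np.log2(p)))
    return np.array(features)

# Example: 4 s of a synthetic 10 Hz rhythm with noise, sampled at 256 Hz.
t = np.arange(0, 4, 1 / 256)
eeg_like = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(wpd_entropy_features(eeg_like).shape)   # (16,) sub-band entropies
```

In a pipeline like the one described, such per-band entropy features would then be fed to the classifier (e.g. the CNN-LSTM) rather than inspected directly.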

10 pages, 1723 KiB  
Communication
Shannon Entropy Analysis of Reservoir-Triggered Seismicity at Song Tranh 2 Hydropower Plant, Vietnam
by Luciano Telesca, Anh Tuan Thai, Michele Lovallo, Dinh Trong Cao and Le Minh Nguyen
Appl. Sci. 2022, 12(17), 8873; https://doi.org/10.3390/app12178873 - 4 Sep 2022
Cited by 4 | Viewed by 1319
Abstract
The reservoir-triggered seismicity at the Song Tranh 2 reservoir in Vietnam is investigated using Shannon entropy, a well-known informational measure of the complexity of time series in terms of disorder and uncertainty. Applying time-varying Shannon entropy to the time series of interevent times of the seismicity has revealed clear links with the temporal fluctuations of the reservoir water level, strengthening the belief that the reservoir operational regime is one of the sources of the seismicity occurring in the area. Shannon entropy has also shed light on the tectonic mechanisms generating reservoir-triggered seismicity, revealing that the change in stress due to the variation in water level drives the seismic system into a state of greater disorder and instability, which would lead to an increase in seismic activity.
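A minimal sketch of a time-varying Shannon entropy of interevent times, of the kind described in the abstract, is given below (in Python; the sliding-window length, the number of histogram bins and the synthetic catalogue are illustrative assumptions, not the parameters used in the paper):

```python
# Hedged sketch of a time-varying Shannon entropy of earthquake interevent times:
# a sliding window of events, interevent times binned into a histogram, and the
# entropy of that histogram tracked through the catalogue.
import numpy as np

def sliding_interevent_entropy(event_times, window=100, n_bins=10):
    """Shannon entropy (bits) of interevent-time histograms in sliding windows."""
    taus = np.diff(np.sort(event_times))      # interevent times
    entropies = []
    for start in range(0, len(taus) - window + 1):
        counts, _ = np.histogram(taus[start:start + window], bins=n_bins)
        p = counts[counts > 0] / window
        entropies.append(-np.sum(p * np.log2(p)))
    return np.array(entropies)

# Example with a synthetic Poisson-like catalogue (exponential interevent times).
rng = np.random.default_rng(0)
times = np.cumsum(rng.exponential(scale=1.0, size=2000))
print(sliding_interevent_entropy(times)[:5])
```

The resulting entropy series could then be compared against an external time series such as the reservoir water level.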

8 pages, 1032 KiB  
Article
A Possible Information Entropic Law of Genetic Mutations
by Melvin M. Vopson
Appl. Sci. 2022, 12(14), 6912; https://doi.org/10.3390/app12146912 - 8 Jul 2022
Cited by 5 | Viewed by 2158
Abstract
The current scientific consensus is that genetic mutations are random processes. According to the Darwinian theory of evolution, only natural selection determines which mutations are beneficial in the course of evolution, and there is no deterministic correlation between any parameter and the probability that these mutations will occur. Here, we investigate RNA genetic sequences of the SARS-CoV-2 virus using Shannon’s information theory, and we report a previously unobserved relationship between the information entropy of genomes and their mutation dynamics. Based on the analysis presented here, we are able to formulate a governing law of genetic mutations, stating that genomes undergo genetic mutations over time driven by a tendency to reduce their overall information entropy, challenging the existing Darwinian paradigm.
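For illustration, a minimal sketch of the underlying calculation, the Shannon information entropy of a nucleotide sequence, is given below (in Python; the toy sequences are illustrative only and are not genomes analysed in the paper):

```python
# Hedged sketch: Shannon information entropy of a nucleotide sequence, which can
# then be compared between a reference sequence and a mutated variant.
from collections import Counter
from math import log2

def sequence_entropy(seq):
    """Shannon entropy (bits per nucleotide) of a genetic sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

reference = "AUGGCCAUUGUAAUGGGCCGCUGAAAGGGUGCCCGAUAG"   # toy RNA fragment
variant   = "AUGGCCAUUGUAAUGGGCCGCUGAAAGGGUGCCCGAUAA"   # single substitution G->A
print(sequence_entropy(reference), sequence_entropy(variant))
```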

12 pages, 2460 KiB  
Article
Shannon (Information) Measures of Symmetry for 1D and 2D Shapes and Patterns
by Edward Bormashenko, Irina Legchenkova, Mark Frenkel, Nir Shvalb and Shraga Shoval
Appl. Sci. 2022, 12(3), 1127; https://doi.org/10.3390/app12031127 - 21 Jan 2022
Cited by 4 | Viewed by 1602
Abstract
In this paper, informational (Shannon) measures of symmetry are introduced and analyzed for patterns built of 1D and 2D shapes. The informational measure of symmetry H_sym(G) characterizes the averaged uncertainty in the presence of symmetry elements from group G in a given pattern, whereas the Shannon-like measure of symmetry Ω_sym(G) quantifies the averaged uncertainty of the appearance of shapes possessing a total of n elements of symmetry belonging to group G in a given pattern. H_sym(G_1) = Ω_sym(G_1) = 0 for patterns built of irregular, non-symmetric shapes, where G_1 is the identity element of the symmetry group. Both informational measures of symmetry are intensive parameters of the pattern and do not depend on the number of shapes, their size, or the total area of the pattern. They are also insensitive to the long-range order (translational symmetry) inherent in the pattern. Additionally, informational measures of symmetry of fractal patterns are addressed, mixed patterns including curves and shapes are considered, the time evolution of the Shannon measures of symmetry is examined, close-packed and dispersed 2D patterns are analyzed, and an application of the suggested measures of symmetry to the analysis of chemical reactions is demonstrated.
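A minimal sketch of a Shannon-type measure of symmetry is given below (in Python; the normalisation over symmetry-element occurrences is an illustrative assumption and may differ from the exact definitions of H_sym(G) and Ω_sym(G) in the paper). A pattern built only of irregular, non-symmetric shapes gives zero, consistent with the property quoted in the abstract.

```python
# Hedged sketch of a Shannon-type measure of symmetry for a pattern of shapes.
# Each shape is described by the symmetry elements it possesses; P(G_i) is taken
# here as the fraction of all symmetry-element occurrences in the pattern that
# belong to element G_i. This normalisation is an illustrative assumption.
from collections import Counter
from math import log2

def symmetry_measure(pattern):
    """pattern: list of shapes, each a list of symmetry-element labels."""
    counts = Counter(el for shape in pattern for el in shape)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A toy 2D pattern: squares (4-fold rotation + mirror), an equilateral triangle
# (3-fold rotation + mirror) and an irregular shape (identity only).
pattern = [["C4", "m"], ["C4", "m"], ["C3", "m"], ["E"]]
print(symmetry_measure(pattern))
print(symmetry_measure([["E"], ["E"], ["E"]]))   # irregular shapes only -> 0.0
```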
