
Information Theoretic Security and Privacy of Information Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (28 February 2021) | Viewed by 20678

Special Issue Editors


Guest Editor
Institute of Physical and Information Technologies (ITEFI), Spanish National Research Council (CSIC), 28006 Madrid, Spain
Interests: cryptography; privacy; smart cybersecurity; blockchain; IT governance; trust management and trustworthiness

Guest Editor
IBM Research Zurich, 8803 Rüschlikon, Switzerland
Interests: applied cryptography; privacy engineering; provable cryptography; distributed systems; theoretical computer science

Special Issue Information

Dear Colleagues,

The use of cryptography to protect our information is well known among scientists and technologists. However, knowledge of the limitations of the traditional approaches of encryption or obfuscation concerning the privacy level they ensure is not so widespread. In engineering, it is even common to come across claims of purportedly anonymized data sets that have only been subject to removal of their main identifiers. This becomes even more dangerous given the vast and rich amounts of data that large organizations and governments control. Here, a tradeoff is typically inherent: if robust desensitization techniques are applied, the data set loses utility; but if we are lenient in the process of removing sensitive data in order to ensure utility, privacy is threatened, especially when taking into account the constraints set by the use case in terms of who controls what information, or the required throughput of the process. Somewhat orthogonally, but also affecting the right to privacy and boosted by easy access to vast amounts of data, there is a growing need to evaluate the veracity of information, as it becomes easier to artificially generate media that appear legitimate to human perception. In any of these situations, objective evaluations of either privacy or data quality, derived from information-theoretic techniques, are highly desirable.

All these challenges call for an interdisciplinary approach in which cryptographers, privacy engineers, and data scientists collaborate to address theoretical, engineering, and business needs. Robust data curation mechanisms, encompassing user-centric security ceremonies as well as privacy-preserving and cryptographic techniques, have to come together to provide a multilayered approach.

Dr. David Arroyo
Dr. Jesus Diaz Vico
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Theoretic data privacy
  • Cryptographic engineering
  • Privacy engineering
  • Differential privacy
  • Utility–privacy tradeoff
  • Big data
  • User-centric security
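The utility–privacy tradeoff named in the keywords above has a standard quantitative instance in differential privacy: answering a query with Laplace noise calibrated to the query's sensitivity and a privacy budget ε. The sketch below is purely illustrative (plain Python, not drawn from any paper in this issue); smaller ε means stronger privacy but noisier, less useful answers.

```python
import random

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value under epsilon-differential privacy by adding
    Laplace noise with scale sensitivity/epsilon. The noise is sampled as
    the difference of two exponentials, which is Laplace-distributed."""
    scale = sensitivity / epsilon
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_value + noise
```

For a counting query (sensitivity 1), ε = 0.1 yields noise with a mean absolute error of 10, while ε = 10 yields only 0.1: the same mechanism spans the whole utility–privacy spectrum through one parameter.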

Published Papers (7 papers)


Research

14 pages, 420 KiB  
Article
A Compression-Based Method for Detecting Anomalies in Textual Data
by Gonzalo de la Torre-Abaitua, Luis Fernando Lago-Fernández and David Arroyo
Entropy 2021, 23(5), 618; https://0-doi-org.brum.beds.ac.uk/10.3390/e23050618 - 16 May 2021
Cited by 8 | Viewed by 2546
Abstract
Nowadays, information and communications technology systems are fundamental assets of our social and economic model, and thus they should be properly protected against the malicious activity of cybercriminals. Defence mechanisms are generally articulated around tools that trace and store information in several ways, the simplest one being the generation of plain-text files known as security logs. Such log files are usually inspected, in a semi-automatic way, by security analysts to detect events that may affect system integrity, confidentiality and availability. On this basis, we propose a parameter-free method to detect security incidents from structured text regardless of its nature. We use the Normalized Compression Distance to obtain a set of features that can be used by a Support Vector Machine to classify events from a heterogeneous cybersecurity environment. In particular, we explore and validate the application of our method in four different cybersecurity domains: HTTP anomaly identification, spam detection, Domain Generation Algorithm tracking and sentiment analysis. The results obtained show the validity and flexibility of our approach in different security scenarios with a low configuration burden.
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
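The Normalized Compression Distance at the heart of this method is easy to sketch. The version below uses zlib as the compressor (the paper's choice of compressor may differ) and is only a minimal illustration, not the authors' implementation:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: close to 0 for near-identical
    strings, close to 1 for unrelated ones, using compressed lengths
    as proxies for Kolmogorov complexity."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)
```

A feature vector for the SVM stage can then be built, for instance, from the NCD of a new log line against a set of reference lines of known classes.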

15 pages, 350 KiB  
Article
Energy Efficiency Optimization in Massive MIMO Secure Multicast Transmission
by Bin Jiang, Linbo Qu, Yufei Huang, Yifei Zheng, Li You and Wenjin Wang
Entropy 2020, 22(10), 1145; https://0-doi-org.brum.beds.ac.uk/10.3390/e22101145 - 12 Oct 2020
Cited by 2 | Viewed by 1827
Abstract
Herein, we focus on energy efficiency optimization for massive multiple-input multiple-output (MIMO) downlink secure multicast transmission exploiting statistical channel state information (CSI). Privacy engineering in the field of communication is a topic of active study. In our system, a common signal is multicast by the base station to multiple legitimate user terminals, but an eavesdropper might intercept this signal. To achieve the energy efficiency utility–privacy trade-off of multicast transmission, we formulate the problem of maximizing the energy efficiency, defined as the ratio of the secure transmit rate to the power consumption. To simplify the formulated nonconvex problem, we use a lower bound of the secure multicast rate as the numerator of the design objective. We then obtain the eigenvectors of the optimal transmit covariance matrix in closed form, simplifying the matrix-valued multicast transmission strategy problem into a power allocation problem in the beam domain. By utilizing the Minorize-Maximize method, an iterative algorithm is proposed to decompose the secure energy efficiency optimization problem into a sequence of fractional programming subproblems. By using Dinkelbach's transform, each subproblem becomes an iterative problem with a concave objective function, which can be solved by classical convex optimization. We guarantee the convergence of the proposed two-level iterative algorithm. In addition, we reduce the computational complexity of the algorithm by substituting the design objective with its deterministic equivalent. The numerical results show that the proposed approach performs well compared with conventional methods.
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
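Dinkelbach's transform mentioned in the abstract turns a fractional objective f(x)/g(x) into a sequence of subtractive problems f(x) − λ·g(x). The toy below applies it to a scalar rate-over-power ratio on a grid; the function names and the simple objective are ours, chosen purely to illustrate the technique, not the paper's beam-domain problem:

```python
import math

def dinkelbach_max_ratio(f, g, candidates, tol=1e-9, max_iter=100):
    """Maximize f(x)/g(x) (with g > 0) over a finite candidate set:
    repeatedly solve max_x f(x) - lam*g(x), then update lam = f(x)/g(x).
    lam is nondecreasing, so the loop converges on a finite set."""
    lam = 0.0
    best = candidates[0]
    for _ in range(max_iter):
        best = max(candidates, key=lambda x: f(x) - lam * g(x))
        new_lam = f(best) / g(best)
        if abs(new_lam - lam) < tol:
            break
        lam = new_lam
    return best, f(best) / g(best)

# Toy "energy efficiency": achievable rate log(1 + p) over total power 1 + p
p_star, ee = dinkelbach_max_ratio(
    lambda p: math.log(1.0 + p),
    lambda p: 1.0 + p,
    [i * 0.001 for i in range(10001)],
)
```

For this toy objective the optimum is known analytically (p = e − 1, efficiency 1/e), which makes the subtractive iteration easy to check.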

13 pages, 798 KiB  
Article
High Efficiency Continuous-Variable Quantum Key Distribution Based on ATSC 3.0 LDPC Codes
by Kun Zhang, Xue-Qin Jiang, Yan Feng, Runhe Qiu and Enjian Bai
Entropy 2020, 22(10), 1087; https://0-doi-org.brum.beds.ac.uk/10.3390/e22101087 - 27 Sep 2020
Cited by 8 | Viewed by 2412
Abstract
Due to the rapid development of quantum computing technology, encryption systems based on computational complexity are facing serious threats. Grounded in the fundamental principles of quantum mechanics, continuous-variable quantum key distribution (CVQKD) offers security guaranteed at the physical level and can effectively remove the current encryption systems' dependence on computational complexity. In this paper, we construct spatially coupled (SC)-LDPC codes and quasi-cyclic (QC)-LDPC codes by adopting the parity-check matrices of the low-density parity-check (LDPC) codes in the Advanced Television Systems Committee (ATSC) 3.0 standard as base matrices, and introduce these codes for information reconciliation in the CVQKD system in order to improve reconciliation efficiency, and thereby the final secret key rate and transmission distance. Simulation results show that the proposed LDPC codes can achieve a reconciliation efficiency higher than 0.96. Moreover, a high final secret key rate and a long transmission distance can be obtained by using our proposed LDPC codes for information reconciliation.
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
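The reconciliation step rests on the parity-check relation H·c = 0 (mod 2) that defines any LDPC code. The sketch below checks syndromes for a tiny (7,4) Hamming code rather than an actual ATSC 3.0 matrix, which would be far too large to inline; it only illustrates the relation itself:

```python
def syndrome(H, c):
    """Compute the GF(2) syndrome of word c under parity-check matrix H.
    An all-zero syndrome means c satisfies every parity check."""
    return [sum(h * b for h, b in zip(row, c)) % 2 for row in H]

# (7,4) Hamming code parity checks -- a small stand-in for a real LDPC matrix
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
```

In reconciliation, Alice and Bob exchange syndromes of their correlated strings so that Bob can correct the discrepancies without revealing the string itself.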

34 pages, 1581 KiB  
Article
Designing Two Secure Keyed Hash Functions Based on Sponge Construction and the Chaotic Neural Network
by Nabil Abdoun, Safwan El Assad, Thang Manh Hoang, Olivier Deforges, Rima Assaf and Mohamad Khalil
Entropy 2020, 22(9), 1012; https://0-doi-org.brum.beds.ac.uk/10.3390/e22091012 - 10 Sep 2020
Cited by 17 | Viewed by 4222
Abstract
In this paper, we propose, implement, and analyze the structures of two keyed hash functions using the Chaotic Neural Network (CNN). These structures are based on the Sponge construction, and they produce two hash value lengths, i.e., 256 and 512 bits. The first structure is composed of a two-layered CNN, while the second one is formed by a one-layered CNN and a combination of nonlinear functions. Indeed, the proposed structures employ two strong nonlinear systems, namely a chaotic system and a neural network system. In addition, the proposed study is a new methodology for combining chaotic neural networks with the Sponge construction that is proved secure against known attacks. The performance of the two proposed structures is analyzed in terms of security and speed. Regarding the security measures, the number of hits of the two proposed structures does not exceed 2 for 256-bit hash values and does not exceed 3 for 512-bit hash values. In terms of speed, the average number of cycles to hash one data byte (NCpB) is equal to 50.30 for Structure 1, and to 21.21 and 24.56 for Structure 2 with 8 and 24 rounds, respectively. Furthermore, the performance of the two proposed structures is compared with that of the standard hash functions SHA-3 and SHA-2, and with other classical chaos-based hash functions in the literature. The results of the cryptanalytic analysis and the statistical tests highlight the robustness of the proposed keyed hash functions and their suitability for applications such as Message Authentication, Data Integrity, Digital Signature, and Authenticated Encryption with Associated Data.
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
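The Sponge construction underlying both structures alternates absorbing message blocks into a state with a fixed permutation, then squeezes output blocks from the state. The toy below substitutes SHA-256 for the paper's chaotic-neural-network permutation, so it sketches the construction only, not the proposed hash functions; the rate/capacity split and padding rule are simplified assumptions:

```python
import hashlib

def toy_sponge(message: bytes, rate: int = 8, out_len: int = 32) -> bytes:
    """Toy sponge over a 32-byte state: XOR rate-sized blocks into the
    state, apply a permutation (here SHA-256, purely as a stand-in),
    then squeeze rate-sized output blocks until out_len bytes emerge."""
    state = bytes(32)
    # Simple padding: append 0x80, then zeros up to a multiple of `rate`
    padded = message + b"\x80" + bytes((-len(message) - 1) % rate)
    for i in range(0, len(padded), rate):           # absorbing phase
        block = padded[i:i + rate] + bytes(32 - rate)
        state = bytes(a ^ b for a, b in zip(state, block))
        state = hashlib.sha256(state).digest()
    out = b""
    while len(out) < out_len:                       # squeezing phase
        out += state[:rate]
        state = hashlib.sha256(state).digest()
    return out[:out_len]
```

Keying such a sponge is typically done by absorbing the key before (or with) the message, which is the role the CNN's chaotic parameters play in the paper's constructions.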

21 pages, 1425 KiB  
Article
A Novel Image-Encryption Scheme Based on a Non-Linear Cross-Coupled Hyperchaotic System with the Dynamic Correlation of Plaintext Pixels
by Wenjin Hou, Shouliang Li, Jiapeng He and Yide Ma
Entropy 2020, 22(7), 779; https://0-doi-org.brum.beds.ac.uk/10.3390/e22070779 - 17 Jul 2020
Cited by 8 | Viewed by 2466
Abstract
Based on a logistic map and the Feigenbaum map, we propose a logistic Feigenbaum non-linear cross-coupled hyperchaotic map (LF-NCHM) model. Experimental verification showed that the system is hyperchaotic. Compared with existing cross-coupled mappings, LF-NCHM demonstrated a wider hyperchaotic range, better ergodicity and richer dynamic behavior. A hyperchaotic sequence with the same number of elements as the image pixels was generated by LF-NCHM, and a novel image-encryption algorithm with a permutation that is dynamically related to plaintext pixels was proposed. In the scrambling stage, the position of the first scrambled pixel was related to the sum of the plaintext pixel values, and the positions of the remaining scrambled pixels were related to the pixel values after the previous scrambling. The scrambling operation also had a certain diffusion effect. In the diffusion phase, using the same chaotic sequence as in the scrambling stage increased the usage rate of the hyperchaotic sequence and improved the computational efficiency of the algorithm. A large number of experimental simulations and cryptanalyses were performed, and the results prove that the algorithm has outstanding security and extremely high encryption efficiency. In addition, LF-NCHM can effectively resist statistical analysis attacks, differential attacks and chosen-plaintext attacks.
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
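The plaintext-coupled scrambling idea, in which the first permuted position depends on the sum of the plaintext pixel values and later ones on the previously scrambled pixel, can be sketched as follows. This is a toy over a flat pixel list with an assumed keystream, meant only to show the coupling; it is not the LF-NCHM algorithm, and the inverse operation needed for decryption is deliberately left out:

```python
def plaintext_coupled_scramble(pixels, keystream):
    """Swap-based scrambling where each swap target depends on the
    plaintext: the first position comes from the pixel sum, subsequent
    ones from the value just placed, so any change in the plaintext
    propagates through the whole permutation."""
    out = list(pixels)
    n = len(out)
    pos = sum(out) % n              # first position: plaintext pixel sum
    for i in range(n):
        j = (pos + keystream[i % len(keystream)]) % n
        out[i], out[j] = out[j], out[i]
        pos = out[i] % n            # next position: just-scrambled pixel
    return out
```

Because the scrambling path depends on pixel values, the permutation itself already contributes diffusion, which is the effect the abstract points out.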

28 pages, 3637 KiB  
Article
Anomaly Detection for Individual Sequences with Applications in Identifying Malicious Tools
by Shachar Siboni and Asaf Cohen
Entropy 2020, 22(6), 649; https://0-doi-org.brum.beds.ac.uk/10.3390/e22060649 - 12 Jun 2020
Cited by 4 | Viewed by 3405
Abstract
Anomaly detection refers to the problem of identifying abnormal behaviour within a set of measurements. In many cases, one has some statistical model for normal data, and wishes to identify whether new data fit the model or not. However, in others, while there are normal data to learn from, there is no statistical model for this data, and there is no structured parameter set to estimate. Thus, one is forced to assume an individual sequences setup, where there is no given model or any guarantee that such a model exists. In this work, we propose a universal anomaly detection algorithm for one-dimensional time series that is able to learn the normal behaviour of systems and alert for abnormalities, without assuming anything on the normal data, or anything on the anomalies. The suggested method utilizes new information measures that were derived from the Lempel–Ziv (LZ) compression algorithm in order to optimally and efficiently learn the normal behaviour (during learning), and then estimate the likelihood of new data (during operation) and classify it accordingly. We apply the algorithm to key problems in computer security, as well as a benchmark anomaly detection data set, all using simple, single-feature time-indexed data. The first is detecting botnet Command and Control (C&C) channels without deep inspection. We then apply it to the problems of malicious tool detection via system call monitoring and data leakage identification. We conclude with the New York City (NYC) taxi data. Finally, using information-theoretic tools, we show that an attacker's attempt to maliciously fool the detection system by trying to generate normal data is bound to fail, either due to a high probability of error or because of the need for huge amounts of resources.
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
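The flavour of LZ-based information measures can be illustrated with plain LZ78 incremental parsing: a sequence that matches previously seen patterns parses into few long phrases, while anomalous data produces many short, novel phrases. The function below is a textbook LZ78 phrase counter, a deliberate simplification of the measures actually derived in the paper:

```python
def lz78_phrases(seq) -> int:
    """Count the distinct phrases in an LZ78 incremental parsing of seq.
    Each phrase is the shortest prefix of the remaining input not seen
    before; repetitive (normal) data yields fewer, longer phrases."""
    phrases = set()
    w = ""
    for c in seq:
        w += c
        if w not in phrases:
            phrases.add(w)
            w = ""  # phrase complete, start the next one
    return len(phrases) + (1 if w else 0)  # count a trailing partial phrase
```

A per-symbol anomaly score can then be derived from how much a new symbol inflates the phrase count relative to the model learned from normal traffic.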

53 pages, 13998 KiB  
Article
Novel Models of Image Permutation and Diffusion Based on Perturbed Digital Chaos
by Thang Manh Hoang and Safwan El Assad
Entropy 2020, 22(5), 548; https://0-doi-org.brum.beds.ac.uk/10.3390/e22050548 - 13 May 2020
Cited by 10 | Viewed by 3017
Abstract
Most chaos-based cryptosystems utilize the stationary dynamics of chaos for permutation and diffusion, and many of them have been successfully attacked. In this paper, novel models of image permutation and diffusion are proposed, in which the chaotic map is perturbed at the bit level on its state variables, on its control parameters, or on both. The perturbation amounts are initially the pixel coordinates in the permutation and the value of the ciphered word in the diffusion, and thereafter a value extracted from the state variables at every iteration. Under this persistent perturbation, the dynamics of the chaotic map are nonstationary and dependent on the image content. The simulation results and analyses demonstrate the effectiveness of the proposed models by means of the good statistical properties of the transformed image obtained after only a single round.
(This article belongs to the Special Issue Information Theoretic Security and Privacy of Information Systems)
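The perturbation idea can be illustrated with the plain logistic map: a small, data-dependent disturbance injected into the state at each iteration makes the orbit nonstationary and content-dependent, and chaos amplifies even a tiny disturbance into a completely different keystream. The sketch below is ours (the paper's maps and bit-level perturbation scheme are more elaborate) and only demonstrates that divergence:

```python
def perturbed_logistic_keystream(x0, r, n, perturb=None):
    """Generate n keystream bytes from the logistic map x <- r*x*(1-x),
    optionally folding a small per-step perturbation (e.g. derived from
    the data being encrypted) back into the unit interval."""
    x = x0
    out = []
    for i in range(n):
        x = r * x * (1.0 - x)
        if perturb is not None:
            x = (x + perturb(i) * 1e-6) % 1.0  # tiny data-dependent nudge
        out.append(int(x * 256) % 256)
    return out
```

Because the map's Lyapunov exponent is positive, perturbations of order 10⁻⁶ grow to macroscopic differences within a few dozen iterations, which is what ties the keystream to the image content.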
