Mass-Energy-Information equivalence principle

Landauer’s principle, formulated in 1961, states that logical irreversibility implies physical irreversibility and demonstrates that information is physical. The mass-energy-information (M-E-I) equivalence principle proposes that a bit of information is not just physical, as already demonstrated, but has a finite and quantifiable mass while it stores information. Proposed in 2019, the M-E-I equivalence principle extends Landauer’s principle: if information is equivalent to energy, according to Landauer, and energy is equivalent to mass, according to Einstein’s special relativity, then mass, energy and information must all be equivalent, too (i.e. if M = E and E = I, then M = E = I). The M-E-I equivalence principle has generated a number of interesting ramifications in physics, including the concept of information as the fifth state of matter in the universe and possibly the missing dark matter. In this framework, it was shown that the mass of a bit of information at room temperature (300 K) is 3.19 × 10^-38 kg.
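As a quick numerical check of the quoted figure, the bit mass follows from combining the Landauer erasure energy kT·ln(2) with E = mc². A minimal sketch, assuming standard CODATA constants:

```python
import math

# Physical constants (CODATA values)
K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light in vacuum, m/s

def bit_mass(temperature_k: float) -> float:
    """Mass of one bit of stored information: m = k_B * T * ln(2) / c^2."""
    return K_B * temperature_k * math.log(2) / C**2

m_bit = bit_mass(300.0)  # room temperature
print(f"mass of one bit at 300 K: {m_bit:.3g} kg")  # ~3.19e-38 kg
```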

Testing the fifth state of matter, simulation hypothesis and the M-E-I principle

The mass-energy-information equivalence principle and the estimated information content of the observable matter in the universe represent two important conjectures, called the information conjectures. Combining information theory with the physical principles of thermodynamics, these theoretical proposals make specific predictions about the mass of information, as well as the most probable information content per elementary particle. This experimental protocol allows empirical verification of the information conjectures by confirming the predicted information content of elementary particles. The experiment involves a matter-antimatter annihilation process. When an electron and a positron annihilate, in addition to the two 511 keV gamma photons resulting from the conversion of their rest masses into energy, we predict that two additional low-energy photons should be detected, resulting from the erasure of their information content. At room temperature, an electron-positron annihilation should produce two infrared photons of ~50 μm wavelength due to the information erasure. This experiment could therefore confirm both information conjectures and the existence of information as the fifth state of matter in the universe.
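The predicted wavelength can be sketched from the Landauer erasure energy of 1.509 bits per particle at 300 K, assuming the photon carries exactly that energy (λ = hc/E):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light in vacuum, m/s
BITS_PER_PARTICLE = 1.509  # predicted information content per elementary particle

def erasure_photon_wavelength(temperature_k: float) -> float:
    """Wavelength of a photon carrying the erased information energy:
    E = n_bits * k_B * T * ln(2), lambda = h * c / E."""
    energy = BITS_PER_PARTICLE * K_B * temperature_k * math.log(2)
    return H * C / energy

lam = erasure_photon_wavelength(300.0)
print(f"predicted wavelength at 300 K: {lam * 1e6:.1f} um")  # ~46 um, i.e. ~50 um infrared
```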


Information catastrophe - power requirements singularity 

Currently we produce ~10^21 digital bits of information annually on Earth. Assuming a 20% annual growth rate, we estimate that ~350 years from now the number of bits produced will exceed the number of all atoms on Earth, ~10^50. After ~246 years, the power required to sustain this digital production will exceed 18.5 TW, i.e. the total planetary power consumption today, and ~500 years from now the digital content will account for more than half of the Earth’s mass, according to the mass-energy-information equivalence principle. Besides the existing global challenges, our estimates point to another singularity event for our planet, called the Information Catastrophe.
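A back-of-envelope sketch of the compound-growth arithmetic behind the atom-count crossover (the published estimates rest on more detailed assumptions, so the exact year differs somewhat):

```python
import math

BITS_PER_YEAR_NOW = 1e21  # current annual digital bit production
GROWTH = 1.20             # assumed 20% annual growth rate
ATOMS_ON_EARTH = 1e50     # approximate number of atoms on Earth

# Years t until 1e21 * 1.2^t = 1e50, i.e. t = (50 - 21) * ln(10) / ln(1.2)
t_atoms = (math.log(ATOMS_ON_EARTH) - math.log(BITS_PER_YEAR_NOW)) / math.log(GROWTH)
print(f"annual bit production exceeds Earth's atom count after ~{t_atoms:.0f} years")
```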

Information theory used to detect and predict genetic mutations

The current scientific consensus is that genetic mutations are random processes. According to the Darwinian theory of evolution, only natural selection determines which mutations are beneficial in the course of evolution, and there is no deterministic correlation between any parameter and the probability that a given mutation will occur. Investigating RNA genetic sequences of the SARS-CoV-2 virus using Shannon’s information theory, we uncovered a previously unobserved relationship between the information entropy of genomes and their mutation dynamics. This allowed the formulation of a governing law of genetic mutations, stating that genomes undergo genetic mutations over time driven by a tendency to reduce their overall information entropy, challenging the existing Darwinian paradigm. This represents a paradigm shift in genetic technologies, as it opens the way to predicting genetic mutations deterministically before they occur, with major implications for pandemic monitoring, virology, cancer research, genetic therapies, evolution theory and beyond.
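A minimal sketch of the Shannon information entropy computation on a toy RNA string (illustrative only; the published analysis is applied to full SARS-CoV-2 genome sequences):

```python
import math
from collections import Counter

def shannon_entropy(sequence: str) -> float:
    """Shannon information entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform sequence over the 4-letter RNA alphabet has maximal entropy
# (2 bits/symbol); a biased composition has lower entropy.
print(shannon_entropy("AUGC" * 25))      # 2.0
print(shannon_entropy("AAAAUUGC" * 25))  # 1.75
```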

Second law of Information Dynamics

One of the most powerful laws in physics is the second law of thermodynamics, which states that the entropy of any isolated system remains constant or increases over time. In fact, the second law is applicable to the evolution of the entire universe, and Clausius stated, “The entropy of the universe tends to a maximum”. New research into the time evolution of information systems, defined as physical systems containing information states within Shannon’s information theory framework, allowed the introduction of the second law of information dynamics (infodynamics). The data demonstrate that the second law of infodynamics requires the information entropy to remain constant or to decrease over time. This is exactly the opposite of the evolution of the physical entropy, as dictated by the second law of thermodynamics. This surprising result has massive implications for future developments in genomic research, evolutionary biology, computing, big data, physics and cosmology.

Second law of Infodynamics and the simulation theory

The simulation hypothesis is a philosophical theory, in which the entire universe and our objective reality are just simulated constructs. Despite the lack of evidence, this idea is gaining traction in scientific circles as well as in the entertainment industry. Recent scientific developments in the field of information physics, such as the publication of the mass-energy-information equivalence principle, appear to support this possibility. In particular, the 2022 discovery of the second law of information dynamics (infodynamics) facilitates new and interesting research tools at the intersection between physics and information. In this work we re-examine the second law of infodynamics and its applicability to digital information, genetic information, atomic physics, mathematical symmetries, and cosmology, and we provide scientific evidence that appears to underpin the simulated universe hypothesis. 

The information capacity of the universe


The information capacity of the universe has been a topic of great debate since the 1970s and continues to stimulate multiple branches of physics research. Using Shannon’s information theory, the amount of encoded information in all the visible matter in the universe has been estimated. This was achieved by deriving a detailed formula estimating the total number of particles in the observable universe, known as the Eddington number, and by estimating the amount of information stored by each particle about itself. Each particle in the observable universe contains 1.509 bits of information, and there are ~6 × 10^80 bits of information stored in all the matter particles of the observable universe.
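As a rough consistency check of the quoted total, the sketch below multiplies the derived 1.509 bits per particle by a particle count of order 4 × 10^80 (an assumed value at the Eddington-number scale, used here only for illustration):

```python
BITS_PER_PARTICLE = 1.509  # derived information content per particle
N_PARTICLES = 4e80         # ASSUMPTION: particle count of Eddington-number order

total_bits = BITS_PER_PARTICLE * N_PARTICLES
print(f"total information content: ~{total_bits:.1e} bits")  # ~6e80 bits
```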

Digital information storage


The introduction of digital data storage changed the way we produce, manipulate and store information. The transition point took place in 1996, when digital storage became more cost-effective for storing information than paper. Digital data storage technologies are very diverse. Most notable are magnetic storage (HDD, tape), optical discs (CD, DVD, Blu-Ray) and semiconductor memories (SSD, flash drive). Each type of memory is suited to specific applications. Semiconductor memories are the preferred choice for portable electronics, optical storage is mostly used for movies, software and gaming, while magnetic data storage remains the dominant technology for high-capacity information storage, including personal computers and data servers. The recent growth in information production appears unstoppable. Each day on Earth we generate 500 million tweets, 294 billion emails, 4 million gigabytes of Facebook data, 65 billion WhatsApp messages and 720,000 hours of new YouTube content. In 2018, the total amount of data created, captured, copied and consumed in the world was 33 zettabytes (ZB), the equivalent of 33 trillion gigabytes. This grew to 59 ZB in 2020 and is predicted to reach a mind-boggling 175 ZB by 2025. One zettabyte is 8,000,000,000,000,000,000,000 bits.
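The zettabyte figures above convert to raw bit counts as follows (1 ZB = 10^21 bytes = 8 × 10^21 bits):

```python
BITS_PER_ZETTABYTE = 8e21  # 1 ZB = 10^21 bytes = 8 * 10^21 bits

yearly_zb = {2018: 33, 2020: 59, 2025: 175}  # global data volume per year, ZB
yearly_bits = {year: zb * BITS_PER_ZETTABYTE for year, zb in yearly_zb.items()}

for year, bits in yearly_bits.items():
    print(f"{year}: {bits:.2e} bits")
```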


Entropic Barcoding Technology

The Entropic Barcoding Technology is the subject of a recent UK patent filed under a collaborative project involving IPI, UoP and a third-party company (Patent Application Number: GB2304794.7, 31/03/2023). The proposed technology uses the concept of data entropy, in the framework of Shannon’s information theory, to transform any dataset (including digital data) by simultaneously compressing its digital footprint, irreversibly encrypting and labelling the data, and facilitating a unique methodology for data integrity checking, data validation, fraud detection and fast identification via laser barcode scanning. The novel method of generating an Entropic Barcode of a dataset is based on the computation of the Shannon Information Entropy (IEs) of that dataset. The obtained Entropic Barcode is a true and unique representation of the information contained within the dataset, converted into a compressed numerical set that can be used for further data processing on its own, or as a graphical, optically readable 2D barcode. The technology could be used to generate Entropic Barcodes of any kind of dataset, including genomic sequences and digital data files. We seek funding or collaborations to further develop the technology for commercialisation. For details, please contact us at:
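As a hypothetical toy sketch of the kind of per-block entropy vector such a barcode could build on (this illustrates only the underlying Shannon-entropy computation, not the patented method; the block count and rounding are arbitrary choices for the example):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon information entropy of a byte sequence, in bits per byte."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropic_barcode(data: bytes, blocks: int = 8) -> list[float]:
    """Split the dataset into equal blocks and record each block's entropy.
    The compact numeric vector can flag tampering: editing any block
    generally changes that block's entropy value."""
    size = max(1, len(data) // blocks)
    return [round(shannon_entropy(data[i:i + size]), 4)
            for i in range(0, size * blocks, size)]

# Toy genomic-style input; real inputs could be any digital file.
barcode = entropic_barcode(b"ACGTACGTGGGGCCCCTTTTAAAACGCGCGCGATATATAT" * 10)
print(barcode)
```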