**Research**

**Mass-Energy-Information equivalence principle**

Landauer’s principle, formulated in 1961, states that logical irreversibility implies physical irreversibility, and demonstrated that information is physical. The mass-energy-information (M-E-I) equivalence principle goes further, proposing that a bit of information is not just physical, as already demonstrated, but has a finite and quantifiable mass while it stores information. Proposed in 2019, the M-E-I equivalence principle is an extension of Landauer’s principle: if information is equivalent to energy, according to Landauer, and energy is equivalent to mass, according to Einstein’s special relativity, then mass, energy and information must all be equivalent, too (i.e. if M = E and E = I, then M = E = I). The M-E-I equivalence principle has generated a number of interesting ramifications in physics, including the concept of information being the fifth state of matter in the universe and possibly the missing dark matter. In this framework, it was shown that the mass of a bit of information at room temperature (300 K) is 3.19 × 10^{-38} kg.
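The quoted figure follows directly from combining Landauer's limit with Einstein's E = mc². A minimal sketch of the calculation:

```python
# Mass of one bit of information at temperature T, per the M-E-I principle:
# Landauer energy E = k_B * T * ln(2), converted to mass via m = E / c^2.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light, m/s

def bit_mass(T: float) -> float:
    """Mass (kg) attributed to one stored bit at temperature T (K)."""
    energy = k_B * T * math.log(2)   # Landauer limit for erasing one bit, J
    return energy / c**2             # mass-energy equivalence

print(bit_mass(300))   # ~3.19e-38 kg at room temperature
```

Note that the result scales linearly with temperature, so a bit stored near absolute zero would carry a correspondingly smaller mass.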

**Testing the fifth state of matter, simulation hypothesis and the M-E-I principle**

The mass-energy-information equivalence principle and the information content of the observable matter in the universe represent two important conjectures, called the information conjectures. Combining information theory with the physical principles of thermodynamics, these theoretical proposals make specific predictions about the mass of information, as well as the most probable information content per elementary particle. This experimental protocol allows empirical verification of the information conjectures by confirming the predicted information content of elementary particles. The experiment involves a matter-antimatter annihilation process. When an electron and a positron annihilate, in addition to the two 511 keV gamma photons resulting from the conversion of their rest masses into energy, we predict that two additional low-energy photons should be detected, resulting from the erasure of their information content. At room temperature, a positron-electron annihilation should produce two infrared photons of ~50 µm wavelength due to the information erasure. This experiment could therefore confirm both information conjectures and the existence of information as the 5^{th} state of matter in the universe.
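The predicted wavelength can be sketched from the quantities given elsewhere on this page, assuming the erasure energy per particle is ~1.509 bits (the estimated information content per elementary particle) at the Landauer cost per bit, and that this energy is carried by a single photon of wavelength λ = hc/E:

```python
# Predicted wavelength of the information-erasure photon in electron-positron
# annihilation, assuming each particle erases ~1.509 bits at the Landauer
# energy cost k_B * T * ln(2) per bit.
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s

def erasure_wavelength(T: float, bits: float = 1.509) -> float:
    """Wavelength (m) of a photon carrying the erasure energy of `bits` bits at T (K)."""
    energy = bits * k_B * T * math.log(2)   # Landauer cost of erasing `bits` bits
    return h * c / energy                   # photon wavelength for that energy

print(erasure_wavelength(300))   # ~4.6e-5 m, i.e. tens of micrometres (far infrared)
```

Under these assumptions the photon lands in the far infrared, consistent with the ~50 µm figure quoted above.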

**Information catastrophe - power requirements singularity**

Currently we produce ~10^{21} digital bits of information annually on Earth. Assuming a 20% annual growth rate, we estimate that ~350 years from now the number of bits produced will exceed the number of atoms on Earth, ~10^{50}. After ~246 years, the power required to sustain this digital production will exceed 18.5 TW, the total planetary power consumption today, and ~500 years from now the digital content will account for more than half of the Earth’s mass, according to the mass-energy-information equivalence principle. Beyond the existing global challenges, these estimates point to another singularity event for our planet, called the Information Catastrophe.
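An illustrative back-of-envelope version of the first estimate: accumulate annual bit production at 20% growth until the running total passes the number of atoms on Earth. (The exact year depends on the precise starting figures; this sketch only reproduces the order of magnitude.)

```python
# Illustrative estimate: with ~1e21 bits produced per year today and 20%
# annual growth, find when the cumulative bit count exceeds ~1e50 atoms.
N0 = 1e21        # bits produced annually today (order of magnitude)
growth = 1.20    # assumed 20% annual growth rate
atoms = 1e50     # approximate number of atoms on Earth

total, annual, year = 0.0, N0, 0
while total < atoms:
    total += annual      # add this year's production to the running total
    annual *= growth     # grow next year's production by 20%
    year += 1

print(year)   # ~358 under these assumptions, i.e. roughly 350 years
```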

**Information theory used to detect and predict genetic mutations**

The current scientific consensus is that genetic mutations are random processes. According to the Darwinian theory of evolution, only natural selection determines which mutations are beneficial in the course of evolution, and there is no deterministic correlation between any parameter and the probability that these mutations will occur. Investigating RNA genetic sequences of the SARS-CoV-2 virus using Shannon’s information theory revealed a previously unobserved relationship between the information entropy of genomes and their mutation dynamics. This allowed the formulation of a governing law of genetic mutations, stating that genomes undergo genetic mutations over time driven by a tendency to reduce their overall information entropy, challenging the existing Darwinian paradigm. This is a paradigm shift in genetic technologies, as it opens the way to a deterministic method of predicting genetic mutations before they occur. It has massive implications for pandemic monitoring, virology, cancer research, genetic therapies and evolutionary theory.

https://www.mdpi.com/2076-3417/12/14/6912

https://www.sciencedirect.com/science/article/abs/pii/S0378437121006567?via%3Dihub
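The quantity tracked in this work is the Shannon information entropy of a nucleotide sequence, H = -Σ pᵢ log₂ pᵢ over the symbol frequencies. A minimal sketch of how it is computed (toy sequences, not real genome data):

```python
# Shannon information entropy of a nucleotide sequence, in bits per symbol:
# H = -sum(p_i * log2(p_i)) over the observed nucleotide frequencies.
import math
from collections import Counter

def shannon_entropy(seq: str) -> float:
    """Entropy in bits/symbol of the empirical symbol distribution of `seq`."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("ATCG" * 10))     # 2.0 bits: uniform over 4 nucleotides
print(shannon_entropy("AAAAAAATCG"))    # ~1.36 bits: biased composition
```

A mutation that makes the nucleotide composition more biased lowers this value, which is the direction of change the proposed law describes.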

**Second law of Information Dynamics**

One of the most powerful laws in physics is the second law of thermodynamics, which states that the entropy of any isolated system remains constant or increases over time. In fact, the second law applies to the evolution of the entire universe; as Clausius stated, “The entropy of the universe tends to a maximum”. New research into the time evolution of information systems, defined as physical systems containing information states within Shannon’s information theory framework, allowed the introduction of the second law of information dynamics (infodynamics). The data demonstrate that the second law of infodynamics requires the information entropy to remain constant or to decrease over time. This is exactly the opposite of the evolution of the physical entropy dictated by the second law of thermodynamics. This surprising result has massive implications for future developments in genomic research, evolutionary biology, computing, big data, physics and cosmology.

**The information capacity of the universe**


The information capacity of the universe has been a topic of great debate since the 1970s and continues to stimulate multiple branches of physics research. Using Shannon’s information theory, the amount of encoded information in all the visible matter in the universe has been estimated. This was achieved by deriving a detailed formula estimating the total number of particles in the observable universe, known as the Eddington number, and by estimating the amount of information stored by each particle about itself. Each particle in the observable universe contains ∼1.509 bits of information and there are ∼6 × 10^{80} bits of information stored in all the matter particles of the observable universe.
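The two quoted figures are consistent with each other: dividing the total bit count by the information per particle recovers the order of magnitude of the Eddington number. A quick arithmetic check:

```python
# Consistency check on the stated figures: ~6e80 total bits at ~1.509 bits
# per particle implies roughly 4e80 matter particles in the observable
# universe, i.e. the order of magnitude of the Eddington number.
bits_per_particle = 1.509
total_bits = 6e80

particles = total_bits / bits_per_particle
print(f"{particles:.2e}")   # ~3.98e80 particles
```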

**Digital information storage**


The introduction of digital data storage changed the way we produce, manipulate and store information. The transition point came in 1996, when digital storage became more cost-effective than paper for storing information. Digital data storage technologies are very diverse. Most notable are magnetic storage (HDD, tape), optical discs (CD, DVD, Blu-Ray) and semiconductor memories (SSD, flash drive). Each type of memory suits specific applications. Semiconductor memories are the preferred choice for portable electronics, optical storage is mostly used for movies, software and gaming, while magnetic data storage remains the dominant technology for high-capacity information storage, including personal computers and data servers. The recent growth in information production appears unstoppable. Each day on Earth we generate 500 million tweets, 294 billion emails, 4 million gigabytes of Facebook data, 65 billion WhatsApp messages and 720,000 hours of new YouTube content. In 2018, the total amount of data created, captured, copied and consumed in the world was 33 zettabytes (ZB) – the equivalent of 33 trillion gigabytes. This grew to 59 ZB in 2020 and is predicted to reach a mind-boggling 175 ZB by 2025. One zettabyte is 8,000,000,000,000,000,000,000 bits.
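The unit conversion and the growth rate implied by the quoted figures can be checked directly (the ~24% compound rate is derived from the 59 ZB and 175 ZB figures above, not stated in the source):

```python
# Unit check and implied growth rate for the quoted data-volume figures:
# 1 ZB = 1e21 bytes = 8e21 bits; 59 ZB (2020) -> 175 ZB (2025).
ZB_IN_BITS = 1e21 * 8              # one zettabyte expressed in bits

bits_2018 = 33 * ZB_IN_BITS        # total bits created/consumed in 2018
cagr = (175 / 59) ** (1 / 5) - 1   # compound annual growth rate, 2020-2025

print(f"{bits_2018:.2e}")   # 2.64e+23 bits
print(f"{cagr:.1%}")        # ~24% per year
```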