Tuesday, February 17, 2026

The Combined Impact of Managing Venezuelan Oil Resources and Rolling Back U.S. Motor Vehicle Emission Standards

 Introduction 

Over the past few months, we have observed two significant developments. If media reports are to be believed, the U.S. government has secured some form of management of Venezuelan oil resources. Venezuela is said to have circa 300 billion barrels of oil resources, the largest known proven accumulation of such resources on the planet. However, some 70-90% of this is regarded as “extra-heavy oil”.

Extra-heavy oil is a form of crude oil that is exceptionally dense (i.e. an API gravity below 10 degrees, for those aware of this measurement standard), has a high viscosity (i.e. it does not flow easily) and usually has a high sulphur content (which creates corrosion problems and is environmentally unfriendly).

The cost of producing this type of oil is considered relatively high. Various industry and academic sources quote this range of costs:

  • Lifting costs of USD 15 - 25 / barrel: This is the operational cost to get the oil out of the ground. It is high as there is no natural flow in the reservoir.
  • Capital expenditure of USD 10 – 25 / barrel: Upfront capital expenditure can be relatively high, related to building steam-generation facilities, heaters, etc., and drilling a large number of wells.
  • Transportation cost of USD 5 - 10 / barrel: Costs are relatively high as the oil must be heated and blended with diluents to flow. Also, due to the high sulphur content, corrosion issues are expensive to manage.

When the above cost structure is coupled with the fact that such heavy oil is mostly sold at a discount to the Brent benchmark (i.e. a discount of USD 2 – 10 / barrel), the margins needed for a viable business become difficult to achieve.

Recall that a host government would also expect a share of the commercial pie by way of royalties and taxes. Currently, it is reported that Venezuela’s state oil royalty is 30 percent of revenue. So, if oil prices are at around the USD 60 / barrel mark for Venezuelan heavy crude, and we assume that in future months the 30 percent royalty is halved to assist a foreign investor’s investment case, then the total cost of producing Venezuelan crude could be conservatively estimated to be:

  • Lifting Cost of USD 15 / barrel,
  • Capital Expenditure of USD 12 / barrel,
  • Transport Cost of USD 5 / barrel,
  • Royalties (15 % at USD 60 / barrel oil price) of USD 9 / barrel

Estimate of Total Upstream Costs: USD 41 / barrel (a rough arithmetic check of this figure is sketched below)
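To make the arithmetic concrete, here is a minimal netback sketch in Python using the illustrative per-barrel figures quoted above. These are indicative industry ranges from this post, not actual project data, and the assumption of a zero further discount reflects that USD 60 / barrel is already taken as the realized price for Venezuelan heavy crude.

```python
# A rough netback sketch (illustrative figures from the post, not project data):
# margin per barrel = realized price - lifting - capex - transport - royalty.
def netback_per_barrel(brent, discount, lifting, capex, transport, royalty_rate):
    realized_price = brent - discount          # heavy crude sells below Brent
    royalty = royalty_rate * realized_price    # royalty is levied on revenue
    total_cost = lifting + capex + transport + royalty
    return realized_price - total_cost

# Figures used in the post: USD 60/bbl realized price (so no further discount
# is applied here), a halved royalty of 15%, and the cost estimates above.
margin = netback_per_barrel(brent=60, discount=0, lifting=15, capex=12,
                            transport=5, royalty_rate=0.15)
print(f"Indicative margin: USD {margin:.0f} per barrel")   # 60 - 41 = 19
```

On these assumptions the indicative margin is about USD 19 per barrel before financing costs and taxes, which shows why the royalty level is so central to the investment case.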

Royalties only become tangible when the oil is monetized, so it is likely that an upstream investment case can be achieved through negotiation with the host government or resource owner. The reality is that current royalty expectations would need to be reduced to make commercialization of this resource a viable investment for upstream players in the oil and gas industry, and the Venezuelan government may be left with little choice in the matter.


What Happens Next?

Well, the oil has to be refined at a reasonable cost, and a market must be found for this oil. Venezuelan oil is heavy and “sour” (i.e. contains sulphur) and requires complex refining to make it easily marketable.

Against this backdrop, the European Union has been setting increasingly stringent emissions standards (a series of regulatory limits on the quantities of pollutants and greenhouse gases (GHGs) that new vehicles may emit). They apply to vehicles sold and registered in EU member states and the European Economic Area (EEA), and often influence standards in other markets.

With motor vehicles being designed to accept cleaner fuels and emit fewer GHGs, marine vessels moving to more environmentally friendly fuel sources, aviation pushing the use of Sustainable Aviation Fuels (SAFs) and China pursuing an aggressive Electric Vehicle strategy, the market for Venezuela’s large oil resource would be limited.

Thus, the relaxation of emission standards in a leading mega economy like the U.S. is a logical progression after achieving management of the large oil resource of Venezuela.

Under President Donald Trump, the U.S. Environmental Protection Agency (EPA) has taken major steps to repeal federal greenhouse-gas emissions standards for new motor vehicles — including undoing the 2009 Endangerment Finding and related vehicle GHG and fuel efficiency rules.

This matters because:

  • Those standards historically pushed for lower tailpipe emissions and greater fuel efficiency (e.g., via Corporate Average Fuel Economy regulations and greenhouse gas limits) - effectively reducing how much fossil fuel (petrol/diesel) could be burned per mile driven.
  • Rolling them back means less regulatory pressure to improve efficiency and curb CO₂ emissions from cars and trucks — the largest source of U.S. transportation emissions.

So, with weaker emissions standards:

  • vehicles can be less fuel-efficient;
  • gasoline and diesel demand can stay higher than under stricter standards; and
  • total greenhouse gas emissions from the transport sector can be higher, as the simple illustration below suggests.
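A back-of-the-envelope calculation illustrates the scale involved. The annual mileage and the two fleet-average fuel-economy figures below are purely illustrative assumptions (they are not EPA or CAFE values); the CO₂-per-gallon factor is an approximate, commonly cited combustion figure.

```python
# A simple, hypothetical illustration of why fuel-economy standards matter for
# fuel demand and CO2: annual fuel use and emissions for the same distance
# driven at two assumed fleet-average efficiencies.
MILES_PER_YEAR = 11_500          # assumed annual distance per vehicle
CO2_KG_PER_GALLON = 8.9          # approximate CO2 from burning one gallon of gasoline

def annual_fuel_and_co2(mpg):
    gallons = MILES_PER_YEAR / mpg
    return gallons, gallons * CO2_KG_PER_GALLON

for label, mpg in [("stricter standard (40 mpg)", 40), ("relaxed standard (30 mpg)", 30)]:
    gallons, co2 = annual_fuel_and_co2(mpg)
    print(f"{label}: {gallons:.0f} gallons/year, {co2/1000:.2f} tonnes CO2/year")
```

Under these assumptions, the less efficient fleet burns roughly a third more fuel, and emits correspondingly more CO₂, for exactly the same distance driven.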

 

Summary 

The Trump administration’s vehicle emissions rollbacks may increase U.S. gasoline / diesel consumption compared to continued stringent regulation.

  • Using heavier, more carbon-intensive Venezuelan crude as the source feedstock means that for every gallon burned, the total climate impact could be greater than if lighter oil or less oil overall were used.
  • Venezuelan crude would likely be a cheaper feedstock for fuels used in U.S. land transportation.
  • The cost of equipment fitted in U.S. motor vehicles to reduce greenhouse gas emissions will likely fall, making such cars cheaper to manufacture.

It is likely that the rollback of emission standards in the U.S. will be challenged in U.S. courts at both state and Federal level. So, the situation remains in flux.

It is important to note that this interplay of Venezuelan oil and U.S. emission standards deregulation is not causal; i.e. U.S. emissions rules do not depend on Venezuelan oil, and Venezuelan oil availability does not determine U.S. vehicle standards. However, their combined effects will likely influence the total emissions footprint of the energy and transportation sectors and the price of oil in the years to come.

 

 

 

 

Saturday, October 5, 2024

The Amazing Henriettas

 

        Henrietta Lacks (1st August 1920 to 4th October 1951)

When Henrietta Lacks arrived at the reception of the Johns Hopkins hospital in 1951 claiming she had a “knot” in her womb, she would not have known that some months later she would be dead. But if there has been one human being whose death has not been in vain, then it would be this 31-year-old African American woman.

Dr George Gey treated Henrietta and as part of his medical investigation took some tissue samples from her womb. To study Henrietta’s ailment, cells from the tissue were placed in a culture of nutrients but instead of dying quickly, as was expected, they grew exponentially, doubling their numbers every 24 hours. Indeed, these cells appeared to be immortal.

What is special about these immortal cells?

Henrietta Lacks' cells began what was the first and, for many years, the only human cell line able to reproduce indefinitely. Her cells, known as “HeLa” cells (for Henrietta Lacks), remain a remarkably durable and prolific line of cells used in medical research around the world. Over the past several decades, this cell line has contributed to many medical breakthroughs, notably the development of the polio and COVID-19 vaccines, the study of leukemia and the AIDS virus, and cancer research globally. HeLa cells have also been transported into outer space for research on the effects of zero gravity on human cells.

Suffice to say that millions of lives have been saved as a result of research enabled by HeLa cells. Henrietta Lacks (or her family) never received financial compensation for her immense contribution. Indeed, as is stated on the cover of the book that narrates her life story, “no dead woman has done more for the living…”

Henrietta Leavitt (4th July 1868 – 12th December 1921)

Henrietta Leavitt was an American astronomer. Her research and work on the properties of a class of stars called “Cepheid Variables” resulted in “Leavitt’s Law”. In essence, Leavitt’s Law links the period and luminosity of these pulsating Cepheid Variable stars, effectively providing a method of measuring the vast distances of interstellar and intergalactic space.
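For readers curious how Leavitt’s insight translates into a distance, the sketch below combines an illustrative period-luminosity relation with the standard distance-modulus formula. The coefficients are representative modern values for classical Cepheids in visible light, not Leavitt’s original calibration, and the example star is hypothetical.

```python
# A minimal sketch of how a period-luminosity relation yields a distance.
# The coefficients are illustrative, not Leavitt's original calibration.
import math

def absolute_magnitude_from_period(period_days, a=-2.43, b=-4.05):
    """Illustrative period-luminosity relation: M = a*(log10(P) - 1) + b."""
    return a * (math.log10(period_days) - 1.0) + b

def distance_parsecs(apparent_mag, absolute_mag):
    """Distance modulus: m - M = 5*log10(d / 10 pc)  =>  d = 10^((m - M + 5)/5)."""
    return 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Example: a Cepheid pulsing every 30 days observed at apparent magnitude 20.
M = absolute_magnitude_from_period(30.0)
d = distance_parsecs(20.0, M)
print(f"Absolute magnitude ~{M:.2f}, distance ~{d/1e6:.1f} million parsecs")
```

The longer the pulsation period, the intrinsically brighter the star; comparing that intrinsic brightness with how faint the star appears gives the distance.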

Before the work of Leavitt and, later, (Edwin) Hubble, it was commonly believed that all the stars of the Universe were within a single galaxy – our Milky Way. Leavitt’s effort later allowed Edwin Hubble to conclude that the Universe was made up of a multitude of galaxies, our galaxy, the Milky Way, being only one of an estimated 2 trillion galaxies that make up the observable Universe.

It has also since been estimated that these 2 trillion galaxies contain more stars (with potentially earth-like planets) than all the grains of beach sand on planet Earth.

On the evening of December 12, 1921, as 53-year-old astronomer Henrietta Swan Leavitt succumbed to cancer, heavy rains fell from the skies over Cambridge, Massachusetts.

Take a bow Henrietta Leavitt for some fundamental work that led humankind on a pathway to even more knowledge about the unfathomable scale and structure of our Universe. 

Both Henrietta Leavitt and Henrietta Lacks succumbed to cancer. Like the indelible crater on the lunar surface named after Henrietta Leavitt - The Leavitt Crater - both left lasting contributions to humankind. 

Saturday, July 29, 2023

The Economics of Climate Change

The Jevons Effect, the Khazzoom–Brookes Postulate and Recent IEA Assumptions – Is it time to Rewrite Economic Theory or Face Reality? 


In his book “The Coal Question”, published in 1865, British economist William Jevons observed that England’s consumption of coal substantially increased after James Watt developed the Watt steam engine. The Watt engine was a more efficient version of Thomas Newcomen’s earlier design. Thanks to Watt’s improvements, coal became a more cost-effective source of energy, leading to increased use of the Watt engine design in a wide range of industries. With greater industrial application came a greater demand for coal, causing Jevons to note: "It is a confusion of ideas to suppose that the economical use of fuel is equivalent to diminished consumption. The very contrary is the truth."


At the peak of the Industrial Revolution, many in Britain were concerned that the nation’s prized coal reserves were rapidly dwindling, and some experts were of the opinion that improving technology would reduce coal consumption. Jevons argued that this view was incorrect, as further increases in efficiency would tend to increase the use of coal. Hence, improving technology would tend to increase the rate at which Britain’s coal deposits, a finite resource, would be depleted, thus threatening the future energy security of the nation. Jevons’s thinking was later termed “the Jevons Effect”.

 

More Recent Research on the Jevons Effect

 

The Jevons Effect is probably the most widely known paradox when assessing environmental economics.  It is sometimes also called the rebound effect (or take-back effect) and it refers to the reduction in expected gains from new technologies (that increase the efficiency of use of a particular resource), because of behavioural or other systemic responses.

In more modern times (the 1980s), Leonard Brookes, then chief economist of the United Kingdom Atomic Energy Authority, revisited the Jevons Effect for the specific case of society's utilization of energy. Brookes argued that attempts to reduce energy consumption by increasing energy efficiency simply raised demand for energy in the economy as a whole. Independent research in the United States by Daniel Khazzoom reinforced the same hypothesis. In 1992, the economist Harry Saunders dubbed the work of Khazzoom and Brookes the “Khazzoom-Brookes Postulate”, a concept similar to the Jevons Effect. Saunders added to the existing research, stating that an increased level of energy efficiency tends to increase energy consumption in two ways. First, it makes the use of energy relatively cheaper, thus encouraging increased use (the direct rebound effect). Secondly, increased energy efficiency increases real income and leads to increased economic growth, which then ramps up energy use for the whole economy.
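The direct rebound effect can be illustrated with a stylized calculation. The figures below are arbitrary, chosen only to show the mechanics; a rebound above 100 percent corresponds to the “backfire” case that Jevons described, where efficiency gains leave total consumption higher than before.

```python
# A stylized illustration of the direct rebound effect (not a forecast):
# an efficiency gain lowers the effective price of an energy service, and
# part of the expected saving is "taken back" by increased use.
def energy_after_efficiency_gain(baseline_energy, efficiency_gain, rebound):
    """
    baseline_energy : energy used before the improvement (arbitrary units)
    efficiency_gain : fraction less energy needed per unit of service (e.g. 0.20)
    rebound         : fraction of the engineering saving taken back by extra use
    """
    engineering_saving = baseline_energy * efficiency_gain
    realized_saving = engineering_saving * (1.0 - rebound)
    return baseline_energy - realized_saving

base = 100.0  # arbitrary units
for rebound in (0.0, 0.5, 1.2):   # 1.2 = "backfire", the Jevons case
    after = energy_after_efficiency_gain(base, efficiency_gain=0.20, rebound=rebound)
    print(f"rebound {rebound:>4.0%}: energy use goes from {base:.0f} to {after:.0f}")
```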

 

Climate Change, the Jevons Effect and Why it Matters

 

In 2021, the International Energy Agency (IEA) published “Net Zero by 2050 – A Roadmap for the Global Energy Sector”. In this document, the content of which has been widely cited by politicians, media and others, it is projected that total energy supply of just over 600 Exajoules (EJ) in the early 2020s will fall to 550 EJ in 2030 (i.e. roughly 7% lower than in 2020). This is projected to occur despite significant increases in the global population (of between about 2 and 3 billion people) because of a fall in energy intensity (the amount of energy used to generate a unit of Gross Domestic Product (GDP)). Figure 1 below shows the IEA projections.
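The tension in this assumption can be seen from the simple identity that total energy use equals GDP multiplied by energy intensity: demand only falls if intensity falls faster than GDP grows. The sketch below uses purely illustrative decade-long growth figures, not IEA data.

```python
# A stylized identity (not IEA data): total energy = GDP x energy intensity.
# If GDP grows faster than intensity falls, total energy demand still rises.
def projected_energy(energy_today, gdp_growth, intensity_change):
    """energy_future = energy_today * (1 + gdp_growth) * (1 + intensity_change)"""
    return energy_today * (1.0 + gdp_growth) * (1.0 + intensity_change)

energy_2020 = 600.0  # EJ, roughly the figure cited in the post
# Illustrative decade-long scenarios for 2030:
scenarios = {
    "intensity falls faster than GDP grows": (0.30, -0.30),
    "intensity falls slower than GDP grows": (0.30, -0.15),
}
for name, (gdp_growth, intensity_change) in scenarios.items():
    print(f"{name}: ~{projected_energy(energy_2020, gdp_growth, intensity_change):.0f} EJ")
```

The first illustrative scenario reproduces an outcome of the kind the IEA projects; the second shows how easily demand rises again if the assumed fall in intensity does not materialize.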


Based on its assumption of a projected fall in energy intensity, the IEA has, in the same publication, called for a halt in investments targeting the development of fossil fuel projects that have not been sanctioned as of 31 December 2021.

Consistent with the learnings from the Jevons Effect (more recently reinforced by the Khazzoom-Brookes Postulate), if energy is affordable (an objective of the Energy Trilemma – see Figure 3), then it is probable that human behaviour will adjust to demand energy at least at the levels seen currently. Thus, the critical assumption made by the IEA in their 2021 document proposing pathways to a net zero world may require review.

If demand does not drop as projected in the coming decades (Figure 1), then there will be a shortfall of supply, causing whatever primary energy sources are available to increase in price and leaving those least able to afford higher costs the most impacted.

Is such a scenario a theoretical aberration or could it manifest into a likely reality?

 

The Learnings of 2022

 

When conflict arose between Russia and Ukraine in early 2022, Western Europe was faced with an energy security threat. At the outbreak of the conflict, Russia provided Western Europe with 35 percent of its gas supply. Curtailing this supply as a punitive measure against Russia, without an alternative replacement, also meant that Western Europeans would face a starvation of energy. With political and economic chaos on the horizon and winter only a few months away, European governments were forced to urgently intervene in the Liquified Natural Gas (LNG) markets to mitigate a potential energy shortage in the European Union (EU). LNG prices spiked as cargoes meant for emerging economies, from Thailand to Pakistan and countries in Africa, were redirected to European ports. Even more glaring, after several wealthy first world governments made this energy grab, they then subsidised the actual cost of energy delivered to their citizens, protecting the levels and quality of their livelihoods and minimising the impact on economic activity.


This was not the case for those in the emerging economies. In the poorer countries, economic activity and livelihoods at many levels were impacted as energy prices soared and governments were unable to fulfil a fundamental obligation of delivering affordable and available energy.

 


Figure 2 shows price movements in EU gas import prices. Clearly, the response of the EU to the Russia / Ukraine war significantly impacted energy prices globally in mid-2022.

If we project forward to a 2050 world in which:

·       there will be 2 – 3 billion additional people, each striving to increase their standard of living to that of their European or American fellow global citizens by performing energy-intensive activities;

·       there will have been a prior period of continuous under-investment in traditional energy delivery systems; and

·       the economic theories advanced by Jevons and, more recently, Khazzoom-Brookes and Saunders have a basis,

then are we not more likely to further widen the energy divide between rich and poor nations in the future? The query that must then be raised is how this highly probable scenario (based on economic theory and real responses in 2022) is consistent with the much-touted UN objective of a “just transition”.

 

The Energy Trilemma – A Dream for Emerging Economies?

 

The Energy Trilemma (Figure 3) refers to the need to find balance between energy reliability/security, affordability and sustainability, and its impact on everyday lives.

 


Empirical evidence has clearly demonstrated that when first world nations face energy reliability / security concerns, as was the case in Europe in 2022, noble global energy equity principles are quickly cast aside. Instead, a desperate “grab” transpires, resulting in price spikes that persist until the richer nations have replenished the storage facilities that will later be drawn down to quench their energy thirst.

Indeed, in 2022, in the wake of the Russia-Ukraine conflict and staring at a winter of energy shortage, the European Union even modified the definitions of its green taxonomy, urgently permitting the inclusion of nuclear and gas as “green” energy resources. Clearly it was a necessary step to avert a continental-scale energy starvation catastrophe.

So, what is the solution?

The solution remains the same. A transition to a cleaner environment must be orderly and measured. It must be based on real, available and applicable technologies, lest this accelerated journey lead to an alternative scenario: “A Probable Evolution into a Future Significant Energy Divide”, shown in Figure 4.



COP 28 – An Opportunity to Balance the Narrative


A “Probable Future Significant Energy Divide” is evolving. Emerging economies should not allow the learnings from 2022 to go to waste, whilst developed nations must understand that unless the emerging economies also chart a path to net zero, there is no viable, holistic climate change agenda for the planet.

When COP 27, held in the Egyptian resort of Sharm El-Sheikh between 6th November and 18th November 2022 concluded, one of the headline outcomes was the reaching of an agreement to compensate nations for loss and damage caused by climate change through the establishment of a fund. The new Loss and Damage Fund (LDF) agreed at COP27 represented a long and hard-fought win for small and vulnerable nation states and an important step towards climate justice. Loss and damage, in its broadest definition, encompasses all the negative impacts of climate change including extreme weather events like hurricanes and flooding, and “slow onset events” like rising temperatures and ocean acidification.

The sources of funds to underwrite “loss and damage” were left for COP 28 to formulate and finalize.

Whilst the LDF would be of great assistance to emerging economies in the aftermath of a climate related catastrophe, what should be the focus of a balanced narrative at COP 28 is the need to avert further evolution of a probable Significant Future Energy Divide. This can only be achieved if the principles of the Energy Trilemma as shown in Figure 3 are respected and vigorously pursued.

For now, the foundation of the Divide has been laid and, if advanced further, will lead to painful outcomes for the poorest in the not-too-distant future. It is the least developed nations which have contributed minimally to the climate change phenomenon being experienced today. Yet it is these very nations which are being asked to also carry the burden of urgently meeting temperature targets to avert a global climate disaster.

A Practical Way Ahead

To ensure long term availability, legacy sources of energy cannot be excluded from the energy mix. Instead, such sources should continue to be developed, with a priority on decarbonization as far as practicable, until projections for a net zero world no longer need to rely on reductions in energy demand that are based on efficiency gains and that defy economic theory such as the Jevons Effect. A sustainable economic future for the emerging economies will not be defined by reactively securing compensation from the LDF. Instead, it should be founded on proactively protecting adequate amounts of future energy supply to underpin stable and increasing levels of economic growth. Thus, emerging economies should use COP 28 to ensure the principles being balanced through the Energy Trilemma are strictly respected, now and in the future, by all, so that everyone will have equal access in the future to what is likely to be a scarce resource.

 

 End of Post 





Monday, October 24, 2022

The Enigma of Blackholes and the Master of Rubik’s Cube

Enigma – Synonymous with the Name of Turing

The word “enigma” refers to something or someone that is, or who is, mysterious or difficult to understand. The Enigma Machine was thus aptly named. By the beginning of the twentieth century it had become necessary to mechanize encryption, and in 1918 a German engineer, Arthur Scherbius, patented the Enigma Machine – a solution to fulfil the growing need for encryption. Originally this device was sold to banks, railway companies and other organizations that needed to communicate secret information. By the mid-1920s the German military had also started to use the Enigma Machine, with some technical variation from the commercial version of the device.

By World War II, this machine was at the very heart of encryption techniques applied to many thousands of highly sensitive coded messages transmitted by the German Armed Forces each day. These messages ranged from top-level signals, such as detailed situation reports prepared by generals at the battle fronts, the movement of troops and orders signed by Hitler himself, to even the important minutiae of war such as weather reports and the inventories of the contents of supply ships. Being able to access the information that was being transmitted was not very difficult as these messages were transmitted as radio signals and were easily detected. The critical step was to interpret the true substance of these encrypted messages and this could only be done if they were first decoded.


The Enigma Machine

Essentially, the Enigma Machine is just a large circuit. When you type a letter on the Enigma Machine it completes a circuit and lights up another letter on a lamp-board. The circuit comprises a plugboard and several rotors. By configuring the plugboard and rotors in a specific manner, an input letter being typed in would result in a different output letter of the alphabet being produced. The permutations allowed by the plugboard and rotors meant that the number of possible machine settings ran into the many billions. In the World War II scenario, the level of complication was further compounded because each day the German Armed Forces would change the relative positions of the plugboard connections and, on alternate days, the rotor orientations as well, with only sender and receiver knowing the specific plugboard / rotor positions applicable for that day. Thus, the task of deciphering these messages, for the purposes of Allied military intelligence gathering, was mammoth.
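To illustrate the “large circuit” idea, the toy sketch below wires a plugboard, a single stepping rotor and a reflector together. It is a greatly simplified illustration, not a faithful model of the historical machine (the rotor and reflector wirings shown are the commonly published ones, but the stepping and offset arithmetic here is simplified); its useful property is that, like Enigma, the same settings both encrypt and decrypt.

```python
# A toy illustration (not the real Enigma) of the "large circuit" idea:
# plugboard swap -> stepping rotor -> reflector -> rotor (inverse) -> plugboard.
import string

ALPHABET = string.ascii_uppercase

def make_plugboard(pairs):
    """Build a letter-swap map from pairs like [('A', 'M'), ('G', 'L')]."""
    mapping = {c: c for c in ALPHABET}
    for a, b in pairs:
        mapping[a], mapping[b] = b, a
    return mapping

ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"      # commonly published Rotor I wiring
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"  # commonly published Reflector B wiring

def enigma_toy(text, plug_pairs, start_pos=0):
    plug = make_plugboard(plug_pairs)
    pos = start_pos
    out = []
    for ch in text.upper():
        if ch not in ALPHABET:
            continue
        pos = (pos + 1) % 26                        # rotor steps before each letter
        c = plug[ch]                                # plugboard in
        c = ROTOR[(ALPHABET.index(c) + pos) % 26]   # through the rotor, offset by position
        c = REFLECTOR[ALPHABET.index(c)]            # reflector bounces the signal back
        c = ALPHABET[(ROTOR.index(c) - pos) % 26]   # back through the rotor (inverse)
        c = plug[c]                                 # plugboard out
        out.append(c)
    return "".join(out)

if __name__ == "__main__":
    settings = dict(plug_pairs=[("A", "M"), ("G", "L")], start_pos=0)
    cipher = enigma_toy("ATTACKATDAWN", **settings)
    print(cipher)
    # Because the reflector makes the whole mapping self-inverse, feeding the
    # cipher text back through the machine with the same settings recovers the plain text.
    print(enigma_toy(cipher, **settings))
```

Daily changes to the plugboard pairs and rotor start positions are what multiplied the number of possible settings so dramatically on the real machine.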


Polish Efforts to Break Enigma

Enigma was used by the German military throughout World War II. Therefore, breaking the Enigma cipher became a top priority, first for the Polish, then later for the British and Americans. Polish Intelligence initially tried to break the German Enigma using conventional code-breaking techniques, but to no avail. Driven by the imperative of trying to mitigate ever-threatening German military tactical moves, they, uniquely among nations at that time, decided to try a mathematical approach. In 1932 a team of young mathematicians was set up. It included Jerzy Rozycki, Henryk Zygalski and Marian Rejewski. A great deal of the foundation work towards the eventual breaking of the Enigma code (from a mathematical perspective) was undertaken by this team of Polish researchers.


Breaking Enigma During World War II

On 1 September 1939, Germany invaded Poland. World War II was now underway. It was now even more important to crack Enigma.


Communications to the German battlefront were done via coded radio messages. The encryption and decryption of these coded messages were done by the Enigma Machine.

On 4 September 1939, the day after Britain declared war on Germany, Alan Mathison Turing, a young English mathematician, computer scientist and logician, reported to Bletchley Park, the wartime station of the Government Code and Cypher School (GC&CS). Turing, an individual of immense intellect, was to become instrumental in the battle to decrypt messages generated by Germany using Enigma. An alumnus of Cambridge University and Princeton University, he initially helped adapt a device originally developed by the Polish researchers to create the “bombe”. The prototype model of his anti-Enigma "bombe", named “Victory”, was installed in the spring of 1940 as he pitted machine against machine to deliver results.


First Day Cover from the United Kingdom showing the "Bombes" and honouring Alan Turing
But the evolution of encryption in Germany was not static and it was left to Turing to discover a way to break into the vast torrent of messages suddenly emanating from a new, and much more sophisticated Nazi cipher machine. The British named the new machine “Tunny” and Tunny was the heart that pumped secure messages through the communication arteries that connected Hitler and the Military High Command in Berlin to the generals on the frontlines. Turing's breakthrough in 1942 yielded the first systematic method for cracking Tunny messages. His method was simply known at Bletchley Park as “Turingery” and the broken Tunny messages provided detailed knowledge of German strategy - information that changed the course of the war.

As early as 1943 Turing's machines were cracking a staggering total of 84,000 Enigma messages each month - two messages every minute. Turing also personally broke a form of Enigma used by the Nazi U-boats preying on the North Atlantic merchant navy convoys. It was a crucial contribution. These convoys set out from North American ports loaded with vast cargoes of essential war supplies such as ammunition, fuel, food and even troops destined for the front lines. But many never got to their British destination ports. Instead, Nazi U-boats torpedoed and sank so many of these ships that Churchill's analysts said Britain would soon be starving and predicted the war could be lost.


The Close Call  

"The only thing that ever really frightened me during the war was the U-boat peril," Churchill would later divulge.


Stamp of Sir Winston Churchill from the Island of Jersey
Just in time, Turing and his group succeeded in cracking the U-boats' coded communications to their High Command in Berlin. With the co-ordinates and intentions of the U-boats revealed, Allied convoys could avoid this submarine menace in the vast Atlantic Ocean.

Turingery was the seed for the sophisticated Tunny-cracking algorithms that were incorporated in Tommy Flowers' Colossus, the first large-scale electronic computer. Ten such computers were built by the end of the war and Bletchley Park became the world's first electronic computing facility.

Turing's work on Tunny was the third of the three strokes of genius that he contributed to the attack on Germany's codes, along with designing the bombe and unravelling the U-boat Enigma. 

It is estimated that Turing’s work reduced the duration of the war by about two years, thus saving countless lives.  Because of the highly classified nature of his work, Turing was never accorded much recognition for his wartime contributions. Sadly, he was eventually disgraced by the Government he served loyally.


Chemical Castration


In 1952, it was discovered that Alan Turing had started a relationship with Arnold Murray, a 19-year-old unemployed man. During an investigation, triggered by a burglary at his residence, Turing acknowledged a sexual relationship with Murray. Homosexual acts were criminal offences in the United Kingdom at that time and both men were charged accordingly. Turing, on the advice of his brother and his own solicitor, entered a plea of guilty. The case, Regina v. Turing and Murray, was brought to trial on 31 March 1952. Turing was convicted and given a choice between imprisonment and probation. His probation would be conditional on his agreement to undergo hormonal physical changes designed to reduce libido, a process now known as "chemical castration”. This harsh treatment rendered Turing impotent.

Turing's conviction led to the removal of his security clearance and barred him from continuing with his cryptographic consultancy for the Government Communications Headquarters. Sadly, on 8 June 1954, Alan Turing's housekeeper found him dead at his home at 43 Adlington Road, Wilmslow, Cheshire. He had died the previous day. He was 41 years of age. An inquest determined that he had committed suicide through the ingestion of cyanide.

A person, whose work has been estimated to have saved approximately fourteen million lives, in the end, could not save his own. Society had its own codes and according to the law of the day, Alan Turing had broken these codes to his own, eventually fatal, detriment.


Black Holes – Synonymous with the Name of Hawking

For many years, Black Holes were an astronomical enigma. It was initially John Michell (25 December 1724 – 21 April 1793), an English natural philosopher and clergyman, who provided pioneering insights into a wide range of scientific fields including astronomy and gravitation. Michell shared his birthday with Sir Isaac Newton, was born not long before Newton’s death and, like Newton, studied at the University of Cambridge; he is considered "one of the greatest unsung scientists of all time". In fact, he is the first person known to have proposed the existence of black holes and the first to have suggested that earthquakes travelled in seismic waves.

The Postulation of John Michell and the Findings of Einstein

Michell postulated that if a star was massive enough, the velocity required to escape the force of its gravity might need to be greater than the speed of light. If this was the case, then the star would be a “dark star”, as no light could escape it. More than a century later, Albert Einstein, one of the Fathers of Black Holes, predicted the potential existence of these black holes with his general theory of relativity (published in 1915), and in 1916 Karl Schwarzschild found the first modern solution of general relativity that would characterize a Black Hole.
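Michell’s argument can be reproduced with schoolbook physics: setting the Newtonian escape velocity equal to the speed of light gives a critical radius of 2GM/c², which happens to coincide with the Schwarzschild radius of general relativity. The short illustrative calculation below uses rounded constants and masses.

```python
# A back-of-the-envelope check of Michell's "dark star" idea, assuming the
# Newtonian escape velocity v = sqrt(2GM/R); setting v = c gives the critical
# radius R = 2GM/c^2, the same value as the Schwarzschild radius.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Radius below which the Newtonian escape velocity exceeds c."""
    return 2 * G * mass_kg / c**2

def escape_velocity(mass_kg, radius_m):
    """Newtonian escape velocity from the surface of a body."""
    return math.sqrt(2 * G * mass_kg / radius_m)

if __name__ == "__main__":
    print(f"Sun compressed to 'dark star' size: {schwarzschild_radius(M_SUN)/1000:.1f} km")
    print(f"Sagittarius A* (~4 million solar masses): "
          f"{schwarzschild_radius(4.0e6 * M_SUN)/1e9:.1f} million km")
    # Escape velocity from the actual Sun (radius ~696,000 km) is far below c.
    print(f"Escape velocity from the Sun: {escape_velocity(M_SUN, 6.96e8)/1000:.0f} km/s")
```

The Sun would have to be squeezed into a sphere roughly 3 km in radius before light could no longer escape it, which conveys how extreme the densities involved are.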


What is a Black Hole?

A black hole is so named as it is a region of spacetime where gravity is so strong that nothing, not even particles or electromagnetic radiation such as light, can escape it. The boundary of no escape that envelopes a black hole is called the event horizon. The event horizon is an information membrane that separates those places of the Universe that we can see from those places of the Universe of which we cannot have sight. Once across the perimeter of the event horizon, an object is drawn by the black hole’s immense gravity into the black hole.

Black holes were long considered a mathematical curiosity and it was not until the 1960s that theoretical research work showed they were a generic prediction of general relativity. It was only in 1967 that the term "Black Hole" was coined by American astronomer John Wheeler.


Stamp from the Republic of Madagascar honouring John Wheeler  who coined the phrase "Black Hole"
Much research has recently been done in the field of black holes. Today we know that there is a black hole at the centre of our own Milky Way Galaxy. We also know how stellar black holes are formed, and in 2019 the Event Horizon Telescope, or EHT, produced the first image of a black hole. That object sits at the centre of the M87 galaxy, about 55 million light-years from Earth. More recently, a constructed image of the black hole at the centre of our galaxy was also developed and widely publicized. Again, it was the Event Horizon Telescope (EHT) Collaboration that delivered this image by combining images extracted from many EHT observations and creating a single image of the supermassive black hole at the centre of our galaxy, called Sagittarius A*.

Stephen Hawking

The other Father of Black Holes is the iconic British scientist, Stephen Hawking. In 1971, Hawking derived his black hole area theorem from Einstein’s theory of general relativity. This theorem states that it is impossible for the surface area of a black hole to decrease over time.



The Fathers of Black Holes - Albert Einstein and Stephen Hawking  featured together on this First Day Cover from the Isle of Man. The reverse side of the FDC features a message from Professor Hawking.

To test Hawking’s theory, researchers analysed gravitational waves, or ripples in the fabric of space-time, created 1.3 billion years ago by two behemoth black holes as they spiralled towards each other at high speed. These were the first gravitational waves ever detected, observed in 2015 by the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO), an instrument in which a laser beam splits into two 2.485-mile-long (4-kilometre) paths and which is capable of detecting the slightest distortions in space-time. By splitting the received signal into two halves — before and after the black holes merged — the researchers calculated the masses and spins of the two original black holes and compared them with those of the new combined one. These numbers, in turn, allowed them to calculate the surface area of each black hole before and after the collision.

The test showed that the surface area of the newly created black hole was greater than that of the initial two combined, confirming Hawking's area law with a more than 95% level of confidence.
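The spirit of that check can be reproduced with rough numbers. The sketch below uses the non-spinning (Schwarzschild) horizon-area formula A = 16π(GM/c²)² and round, GW150914-like masses, so it is only an illustration of the area law, not the actual LIGO analysis, which also accounts for spin.

```python
# A rough numerical check of Hawking's area law for a black hole merger,
# using the non-spinning horizon area A = 16*pi*(G*M/c^2)^2.
# Illustrative, rounded masses only.
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_SUN = 1.989e30   # kg

def horizon_area(mass_kg):
    """Horizon area of a non-spinning black hole, in square metres."""
    r_s = 2 * G * mass_kg / c**2
    return 4 * math.pi * r_s**2

if __name__ == "__main__":
    m1, m2 = 36 * M_SUN, 29 * M_SUN   # roughly GW150914-like initial masses
    m_final = 62 * M_SUN              # ~3 solar masses radiated away as gravitational waves
    a_before = horizon_area(m1) + horizon_area(m2)
    a_after = horizon_area(m_final)
    print(f"Combined area before merger: {a_before:.3e} m^2")
    print(f"Area after merger:           {a_after:.3e} m^2")
    print("Area law satisfied:", a_after >= a_before)
```

Even though about three solar masses of energy are radiated away, the final horizon area exceeds the sum of the two initial areas, which is exactly what the area law requires.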

Having developed his black hole area theorem from the perspective of general relativity, Hawking then applied his mind to black holes from the angle of quantum mechanics. From his work, a concept known as Hawking Radiation emerged — where fogs of particles are believed to be emitted at the edges of black holes through quantum effects (based on Heisenberg’s Uncertainty Principle). Hawking predicted that this quantum phenomenon would lead black holes to gradually shrink and eventually, over a period of time vastly longer than the age of the Universe, cause them to completely evaporate. This evaporation through radiation is thought to occur over timescales long enough so as not to violate the area law in the short term. But there was nonetheless a disconnect, and this contradiction in the findings gave rise to more research, from which the concept of the black hole information paradox emerged. This concept comes into play when the predictions of general relativity and quantum mechanics are combined.
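To get a feel for how slowly this evaporation proceeds, the sketch below uses the standard order-of-magnitude estimate for the evaporation time of an isolated, non-rotating black hole, t ≈ 5120πG²M³/(ħc⁴); the constants are rounded and anything the black hole might absorb is ignored.

```python
# An order-of-magnitude illustration of how slowly Hawking evaporation acts,
# using the standard estimate t_evap ≈ 5120*pi*G^2*M^3 / (hbar*c^4)
# for an isolated, non-rotating black hole (ignoring anything it absorbs).
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
HBAR = 1.055e-34       # J s
M_SUN = 1.989e30       # kg
YEAR = 3.156e7         # seconds
AGE_OF_UNIVERSE = 1.38e10 * YEAR

def evaporation_time(mass_kg):
    """Approximate time for a black hole to evaporate completely, in seconds."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * c**4)

if __name__ == "__main__":
    t = evaporation_time(M_SUN)
    print(f"Solar-mass black hole: ~{t / YEAR:.1e} years")
    print(f"  = ~{t / AGE_OF_UNIVERSE:.1e} times the current age of the Universe")
```

A solar-mass black hole would take of the order of 10^67 years to evaporate, which is why the area law is not visibly violated on any practical timescale.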


Special Cover from India honouring the work of Stephen Hawking 
So, on the one hand, according to Hawking’s area theorem (which is based on general relativity), the surface area of a black hole should never decrease; yet, deferring to his own findings from a quantum mechanics perspective, over a long enough time period the effects of Hawking Radiation would cause the black hole to evaporate, and thus its surface area to shrink. His theories relating to the information paradox were a possible bridge that attempted to close this gap.

Stephen William Hawking is widely respected as one of the great theoretical physicists of the modern era. Between 1979 and 2009, he was the Lucasian Professor of Mathematics at the University of Cambridge, widely viewed as one of the most prestigious academic posts in the world, a post that had once been held by Sir Isaac Newton. 


The Onset of Amyotrophic Lateral Sclerosis

In 1963, at age 21, Hawking was diagnosed with an early-onset, slow-progressing form of motor neuron disease (Amyotrophic Lateral Sclerosis – ALS, for short) that gradually, over the decades, paralyzed him. After the loss of his voice, he communicated through a speech-generating device, initially operated by a handheld switch and eventually activated by a single cheek muscle. Even though he was severely medically disadvantaged, Stephen Hawking realized that the outcomes of the laws of physics were visible for all to see but the laws themselves had not been completely discovered. And he worked relentlessly to define the laws that produced the observable outcomes.

 

Speedcubing – Today, Synonymous with the Name of Max Park

The Rubik’s Cube is a puzzle-cum-toy invented by a Hungarian architecture professor named Erno Rubik. The toy first became popular in the early 1980s and, as its inventor, Rubik was the first person to solve the Cube. Apparently, Rubik spent a month struggling to unscramble it, if only to prove to himself that it could be done. Over the years, the time required to unscramble this cubic conundrum has reduced, and global competitions are held to determine if new records may be created in this space.

According to Wikipedia, Max Park was born on November 28, 2001, in California. When Park was two years old, he was diagnosed with moderate to severe autism. His parents, Miki and Schwan Park, were advised that he might need lifelong care. Max Park's motor skills were severely impaired because of his autism, so his mother, Miki Park, taught Max how to solve a Rubik’s Cube. He began learning "speedcubing" and soon was performing at competitions. At his second competition, he came in first place in the 6×6×6 event.


Stamp commemorating the 1982 World Rubik Cube Tournament held in Budapest, Hungary

Over the years, Park has set multiple world records in solving the 4×4×4, 5×5×5, 6×6×6, and 7×7×7 cubes, and 3×3×3 one-handed. He has won 374 events across many Rubik's cube competitions.  

Apparently, an average person takes more than 3 hours to solve the cube on a first try. There are algorithms that may be learnt to help solve the cube faster. Like some other Rubik's cube solving methods, one can finish the cube with a two-look system (two algorithm applications) or a one-look system (a single algorithm application). The two-look system has 20 potential algorithms to be learned, while the one-look system has a whopping 493 potential algorithms. Hand and finger dexterity are also required if one intends to compete.

Incredibly, Max Park currently ties the world record average for 3x3x3 of 4.86 seconds with Tymon Kolasinski.

Concluding Comments

Alan Turing was an outlier who was finally constrained by the wishes and rules of the general population. Here was an individual who may well have changed the course of history through his wartime contributions but was eventually branded a criminal. 

But the words on a plaque set below Turing’s statue, currently located in Sackville Park, Manchester, begin to set the record straight, referring to him as a “Victim of Prejudice”. As values changed over many decades, it became necessary to act. In December 2013, Queen Elizabeth II officially pronounced Turing pardoned. The Queen's action was only the fourth royal pardon granted since the conclusion of the Second World War. Pardons are normally granted only when the person is technically innocent and a request has been made by the family or another interested party. In the case of Turing's conviction, neither condition was met.

In September 2016, the government announced its intention to expand this retroactive exoneration to other men convicted of similar historical indecency offences, in what is described as the Alan Turing Law. 

The Alan Turing Law is now an informal term for the law in the United Kingdom, contained in the Policing and Crime Act 2017, which serves as an amnesty law to retroactively pardon men who were cautioned or convicted under historical legislation that outlawed homosexual acts. The law applies in England and Wales. The Bank of England further honoured Alan Turing when his image was included as part of a new £50 note which went into circulation in 2021, 67 years after his premature death.

Stephen Hawking lived with ALS for over 50 years. Whilst his body deteriorated, his mind continued to evolve, and he increasingly challenged himself to resolve the most complex problems of modern physics which try to explain the biggest question of all – how was the Universe created?

He once said from the confines of a wheelchair that transported his frail body for decades:


“We are just an advanced breed of monkeys inhabiting a very minor planet, rotating around a very average star. But we can understand the Universe. That makes us something special.”


As for Max Park, surely, he is the living example of a quote from the movie “The Imitation Game” which tells the story of the life and work of Alan Turing. The quote goes like this:


 “Sometimes it’s the people no one imagines anything of, who do the things that no one can imagine.” 


The hallmarks of greatness and genius are to make those things which are difficult to do appear simple.  Turing, Hawking and even Park – who would have ever imagined their achievements given their personal challenges? 

In the case of Turing, he may well have changed the course of history.

Stephen Hawking has helped us understand some elements of our creation. 

And Max Park? 

Max is most definitely an inspiration to the parents of  children who are medically classified as "autistic" through his redefinition of this sometimes misunderstood word. 

With that, I hope you are inspired by the music of Sia. The song is entitled "Never Give Up".

https://www.youtube.com/watch?v=NEsQfb4xWY0


All stamps and First Day Covers featured in this post are from my personal collection.