CHRONOSPHERE » Economics | http://chronopause.com | A revolution in time.

ii. Mirror Mirror Hanging on the Wall, CryoX: Birth of NeoInsurgent Cryonicist
http://chronopause.com/index.php/2012/03/12/ii-mirror-mirror-hanging-on-the-wall-cryox-birth-of-neoinsurgent-cryonicst/
Mon, 12 Mar 2012

By CryoX

{This is a work of fiction {or is it?}}

Mirror mirror hanging on the wall
You don’t have to tell me who’s the biggest fool of all
Mirror mirror I wish you could lie to me
And bring my baby back, bring my baby back to me – m2m

My frequent flier card isn’t a card at all, it’s Parthenocissus tricuspidata (some would argue it’s the Roman numeral IV, instead). Whatever. For me it’s the magic weed that evaporates the financial distance between the coasts three or four times a year. Most of my frat buddies have their business junkets; we academics have our conferences. Alcor and Mike Darwin. Both on the West Coast, as was my upcoming conference. Doable.

I hadn’t seen Max More since my undergraduate days, which I realized were rapidly becoming, no pun intended, a chillingly long time ago. My girlfriend (at the time) and I had attended some cryo/extro/CR get-togethers, and I met Max and his wife Natasha several times. Max was this earnest, muscular, ginger, intellectual type who tried just a little too hard, was just a little too rehearsed and was more than a little too rigid. His wife Natasha? In some slightly different AU, Kurzweil has his Ramona. To me there is something artificial, slightly off and s-t-r-e-t-c-h-e-d t-a-u-t about her. The only time I met Max without her around, I noticed a big difference in him; he was visibly insecure.

Now, Max More is President of Alcor.

I should have called to be sure Max was going to be there instead of just booking for the tour. Stupid. My flight was delayed out of LAX, and with the crazy delay from the limo, I barely made it from Sky Harbor to the Alcor building in time to meet the rest of the group. Unbelievably, the traffic in Phoenix is worse than it is in L.A.

The Alcor building is drab and unimpressive which, because of the idiodyssey of my limo driver, I really don’t understand. There are two Acoma Drives in Scottsdale and the moron (or his company) driving me from the airport had no GPS. We spent half an hour cruising around the Scottsdale Air Park before I finally became desperate enough to shove my Droid in his face and demand he call someone for instructions (shame on me for not having my GPS enabled for travel). The Air Park has lots of architecturally attractive buildings – some quite stylish if you like that Frank Lloyd Wright Desert Look. The Alcor building is Brutalist Bad; plain-ugly-anywhere.

As soon as we were admitted to the lobby/reception area, a bomb went off in my head: Natasha! I don’t know if she had anything to do with it, but that was my reaction.  That kind of space is, by definition, supposed to welcome and draw you in. Instead, there is this big, cold, crystalline blob in the form of an “Infinity Mirror” almost immediately inside the door on the wall to your right, as you walk in.

There are all kinds of problems with this. First, it causes a distraction. The visitors aren’t interacting or socializing with each other, or the Alcor staffer (who should be a scantily clad voluptuous blonde). Instead, they are looking at the “pretty” on the wall, and some of them are even ape-touching it. One Merkeley woman in the group poked me in the ribs and said in an excited whisper, “Oh look into it, look into it.” That was my undoing. Fun-house mirrors, looking down tall glass buildings, certain angles at the Las Vegas  strip: all provoke intense, uncontrollable vertigo and nausea. Instantly, I was an undergrad in a dorm room staring up at an empty case of Dos Equis from the floor.  In one direction was the door to the outside, which the lady who had let us in had locked with a key. In the other direction was a mass of sharp angled stainless steel and glass furniture which I could see myself impaled upon and dying in a pool of my own blood and vomit.  I was paralyzed in front of the magic mirror. All I could do was shut my eyes and think of cool sea breezes. It worked.

The Alcor reception area is done up in grays, icy whites and shiny metals. This is a cryonics company. Its two most obvious and predominant negative images to overcome are death and the cold. I didn’t really need the rest of the tour because even before the nausea had fully subsided, I realized that the special expertise Max had been hired to ply on Alcor was a new, high technology “preservative” skill called techsodermy, which is the cryonics equivalent for “dead” high technology companies. It was invented in the 1980s in Silicon Valley, and while I just made the analogy to cryonics, it really owes its origins more to taxidermy, because it was invented in order to fill dead tech companies with fluff in the hopes of convincing someone to buy them. (When we were waiting for our rides, the Merkeley Lady said the lobby reminded her of Benihana, and that she expected an “Oriental gentleman” with sizzling liquid nitrogen and steak and shrimp to come out and start “chopping our meal” with a Ginzu knife at any moment. At least, she hoped it was steak and shrimp.)

My Old Man is all about money. In fact, he is money. He makes money appear and disappear. He moves money. He cleans it, he packages it, he inventories it, he “handles” it. That means that his clients are, mostly, people who rarely, if ever, touch the filthy stuff. Some of them don’t even want to touch the little pieces of plastic that serve as markers for it. It’s an irony that the people who have the most money are the most visibly invisible of the super rich. If there is anyone reading this who knows what a Smythson Diary is, I’d be very surprised. Perhaps a few more would know how to assess a man’s station by looking at his shoes, or his writing utensil? Today, casual dress is so commonplace and so comfortable…and if you want to be somewhere reasonably economically and you have commonsense and a lot of money, you book first class and you dress sensibly and comfortably. But, if you are in the know – then you know who’s who, and you don’t need a ledger book to tell you.

If you want peace and privacy, then you don’t travel by commercial means at all. That’s for the peasants. You use Flight Centres and private jets, and there is no security screening. And if you want a blow job or a massage, or both en route, that can be arranged for a few hundred dollars more; a small part of the cost of a coach ticket the flying public pays, and that after taking off their shoes and belts and switching planes in Houston and Dallas.

The people at Alcor are clueless about how to get the customers that matter. Not just the rich and the super rich (the people my Old Man services day-in and day-out), but the “good-judgment” segment of every demographic of the population. You may be a working class stiff from Boston in a cloth coat, but you know what the genuine trappings of quality, durability and class are, regardless of the style. Warmth, wealth, style, elegance, quality; whether understated or overstated, they always come through. So does Costco warehouse gray.

My Old Man wanted me to get an M.B.A. But he wasn’t altogether disappointed that instead of the usual frequent flier card I got that Mark IV. He’s interested in cryonics and he thinks it has a technical and (less so) a financial chance of working. But Alcor? I may be that desperate, but unfortunately for me (and him), he’s not.

Fucked.
http://chronopause.com/index.php/2011/08/09/fucked/
Tue, 09 Aug 2011
By Mike Darwin

Have I got your attention now?

Good.

Most people say my writing here is far too long and not nearly to the point. Today I’ll remedy that. [Though you’ll still have to read this http://wp.me/p1sGcr-1h for what I’m going to say here to have much credibility. Read it, read it carefully, and note when the contents were first published on-line.]

A couple of readers have also noted that I “seem to be in a hurry” with whatever my agenda is. Today, in part, I’ll explain why.

Over the past few days the true state of the global economy has started to be unveiled. It is going to get a lot worse. I’m no prophet or seer; would that I were. Because then, I could quantify it all for you and spoil the ending by telling you how it’s going to turn out; for you, for me, and for everyone else on the planet. But the truth is, I have no crystal ball and no metaphysical “inside track” on the future. I had hoped, fervently, that I might have some more time, that we might have some more time – perhaps as much as a year or two, before this global economic decompensation occurred. Well, no such luck. What is happening now is the beginning of what is going to be a very bad time. I have been back and forth over the skin of this earth these past 6 years, and I can tell you that much of the world has been precariously balanced on a knife’s edge of instability, fear, hopelessness and simmering rage for going on a decade now.

When the French Revolution arrived, Louis XVI and Marie Antoinette could hardly have been more surprised. Hosni Mubarak, lying in his hospital bed in a cage in Cairo, must certainly feel a similar sense of disbelief and disorientation. To be plucked from his villa at Sharm el Sheikh, after he surrendered the Presidency? Incredible! The difficulty for many of you reading this (in the Developed World) is that you have lived like Louis, Marie and Hosni for the last few decades – completely out of touch with that segment of the world deemed both untouchable and insignificant. It’s not that you’ve actively avoided them, but rather that you could not even see them, and if you did catch a glimpse of them from time to time, out of the corner of your eye, you not only had no opportunity for discourse with them – you lacked the language – you literally lacked the language – both symbolic and visceral – to communicate. You might more easily have communed with an ant, or an apple tree.

Now, regrettably, many of you are about to join them. Do not worry about any lack of knowledge of their linguistics. The lingua franca of fear and disenfranchisement is one that all but the Doctor Panglosses, and the Wilkins Micawbers of the world, learn with astonishing speed. Chances are, you will too.

I don’t know how much ‘play’ there is left in the system. That means I don’t know when the futile and irrational wars the West is currently prosecuting will be replaced with much larger, more costly and absolutely essential conflicts. It means I don’t know exactly when healthcare expenditures are going to decrease from 17.6% of the GDP, to somewhere in the single digits (and all the grim statistics that implies). It also means I can’t tell you exactly when the currency is going to start really inflating – in part, because I don’t know to what extent deflation from lack of demand for major commodities will occur – or when – although I note that oil prices have already dropped.

I am an expert, a bona fide expert at watching things die and observing, in order to understand the mechanics of that process, even to the point where it has proceeded well into decomposition. Human and non-human, I’ve observed so many deaths I long ago lost count. This has made me wise enough to ‘know it when I see it,’ and wiser still, to know that I lack the tools to bring precision to my understanding of the process. I can tell you when it is underway, but I cannot tell you the appointed minute, the appointed hour, or even the appointed week, month, or year of its arrival.

I said I’d keep this bearably short, and I will. We’ve been fucked. It happened quite some time ago, but in the daze of the booze and drugs, we simply didn’t feel it – until now. My message, here and now, is to first be aware that this has happened. You have no time for denial, or for recriminations. Second, neither panic nor abandon hope in the months to come. Third, immediately stop all non-essential expenditures and save everything you can. When you need to convert those savings into non-cash commodities, of one kind or another, will become apparent in due time. If you have modest and manageable debt, pay it off. If you have large debt, begin to position yourself to walk away from it with as little injury to your assets and psyche as possible. Much of the work of doing this in the US is psychological. In other places, more material preparations will likely be required. Finally, if you are a cryonicist and you want to continue to be one, be prepared to relocate. It is very likely that cryonics (biopreservation) is going to require the support of an active, cohesive and geographically united community.

I am sorry for this message. I hoped to have far more time to sieve a working group of good minds, with good hearts, to confront what is now upon us. No such luck.

Science Fiction, Double Feature, 2: Part 3
http://chronopause.com/index.php/2011/08/06/science-fiction-double-feature-2-part-3/
Sat, 06 Aug 2011

Introduction & Tour of the Alcor-B Foundation’s Mobile, Arizona Patient Care Facility & Existential Colony

 Address given to Alcor-B Foundation Cryopreservation Members and Staff

Alcor-B Cryopreservation Research Foundation (ABCRF)

15 September 2012, by Gorton Carpenter, M.D., Ph.D., President of the Alcor-B Foundation

Figure 1: Alcor is the smaller, dimmer companion to Mizar, the bright star that forms the crook in the handle of the Big Dipper. In the Arab world of the 5th Century CE, Mizar’s much less bright (and more difficult to see) companion was used as a test for good vision: only someone with the clearest and most acute vision could see Alcor. Alcor-B was discovered early in 2011 using Project 1640, which makes use of the Hale Telescope’s adaptive optics system. Project 1640 gives the Hale a view almost equal to what is possible in space. The instrument also has the ability to block out the light of a star, allowing faint objects located next to a star to be seen. The Hale, armed with Project 1640, was pointed at Alcor earlier this year and found that it isn’t a single star. Alcor has a small stellar companion that hadn’t been seen before: Alcor-B, a small, dim red dwarf star about one fourth the mass of our Sun. To see Alcor-B you must have the superior vision that only mastery of the most sophisticated technology allows. Alcor-B is thus a test for the clearest and most acute vision – vision capable of seeing things as they really are – not just as they appear to be.

Figure 2: Alcor-B President, Gorton Carpenter, M.D., Ph.D.

We have covered a fair amount of ground here today, in pretty much the order we needed to, and now it is time to intellectually explore what lies below the surface here, as we are about to do physically.

The existential risks that cryonics and Alcor-B confront are well known to most of you. As you can see in this slide (Figure 3), those risks have been color coded based on the risk assessment done by the Timeline to Recovery Project (TRP) analysts. Those in red were deemed the highest risk, with those in yellow coming in a close second.

Figure 3: The primary existential risks that cryonics patients and Alcor-B staff face in the coming years. Arguably, two of those risks, climate change and economic upheaval, are beginning to unfold at present.

Had we remained in the greater Los Angeles area we would already be suffering from the effects of a devastating earthquake. We can congratulate ourselves for having dodged one arrow. However, there are some arrows of fortune it is hard to escape. We are already in the throes of a serious economic upheaval, indeed of a global depression, and we are also suffering from the early effects of what will very likely be a climatic catastrophe of similar magnitude to the Little Ice Age, or the Mesoamerican Drought that collapsed the civilization of the Maya. We have the technology to cool the earth, but it remains to be seen whether we have the will, as a species, or the political ability equal to the task. Unlike the Big One, these are existential threats from which we cannot run; we can only hope to shelter ourselves until they pass or can be overcome by our own efforts.

Figure 4: The map above shows the probable extent of the spread of a highly infectious communicable disease in the US. By day 87 virtually all populated areas of any size are infected. The inset curve shows the rate of transmission from the start of the infection until saturation is reached at day ~87. There is a window of ~30-50 days where reverse quarantine measures may be effective, provided that the infection has not already entered our colony through contact with Phoenix, or other large metropolises.

The TRP projections show a high probability (0.07) that a consequence of the interaction of the current climatic and economic chaos will be the emergence of one or more highly infectious, and reasonably lethal, pandemic diseases. Already, the untreatable strain of N. gonorrhoeae, which emerged late in 2010, has created major public health problems and untold misery in the Developed World. In the Third World, it has proven an unsustainable strain on the healthcare system, and has resulted in a growing population of often malnourished, and now chronically morbid individuals, who are in a weakened state and easy prey to other infectious agents.

The emergence of resistance to carbapenems, the last line of defense in the treatment of this venereal disease, has broader implications, since global surveillance is increasingly detecting the plasmid that confers antibiotic resistance on this organism in other microorganisms. Given the cohabitation of N. gonorrhoeae with S. aureus, and the various members of the Streptococcus family that naturally colonize our skin, it is likely only a matter of time before fully antibiotic resistant strains of these organisms emerge, and we are plunged back into the dark and terrifying time before effective antimicrobial therapy existed.

This slide (Figure 4) shows the time course and expected pattern of pandemic disease spread in the United States. We are, as you can see, fortunately located with respect to having some days’ notice that trouble is afoot. We have equipped the Alcor-B Facility with Nuclear, Biological and Chemical (NBC) air handling facilities, and we have added to those the ultraviolet virucidal treatment of all intake air. We have also offered subsidies and assistance to all member households on the campus of the Facility to allow them to prepare for such a contingency. And of course, a comprehensive plan of action to implement the reverse quarantine of the Facility is also now operational. Part of the assistance offered resident members here is participation in the food reserves program, which provides sufficient long-term-storable foodstuffs to supply an adequate calorie, protein, and micronutrient intake for 3 years, for each resident of our community.
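The transmission curve inset in Figure 4 is essentially logistic in shape. A minimal sketch shows why the reverse-quarantine window closes so quickly; the growth rate and inflection day below are illustrative assumptions chosen to reproduce saturation near day 87, not TRP parameters:

```python
import math

def logistic_infected(t, r=0.15, t_mid=55):
    """Fraction of populated areas infected at day t.

    r (growth rate per day) and t_mid (inflection day) are
    illustrative assumptions tuned so saturation lands near
    day 87; they are not TRP figures."""
    return 1.0 / (1.0 + math.exp(-r * (t - t_mid)))

# Early in the 30-50 day window the infected fraction is still
# small; by day 87 the curve has saturated.
for day in (30, 50, 87):
    print(day, round(logistic_infected(day), 3))  # 0.023, 0.321, 0.992
```

The qualitative point survives any reasonable choice of parameters: growth is deceptively slow early on, which is exactly when a reverse quarantine is still feasible.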

Figure 5: Two views of the Alcor-B Patient Care Bay during the interval from the mid-1990s (left) to the early 2000s (right).

The Alcor-B patient care facilities in the 1990s (Figure 5) were woefully vulnerable to external assault – a risk we didn’t take seriously, in large measure because we thought we had no reason to. While cryonics has never been a treasured part of this culture (laughter), and while it has long been held in contempt, the contempt was never of a malicious, let alone a violent nature. We were considered non-threatening, and remained so, until the cryopreservation of the iconic American baseball hitter Ted Williams, in 2002.  At that point, an irreversible threshold was crossed.

Cryonics not only touched mainstream American culture in an ineffable way, it forced it to confront the fundamental dichotomy between its values and ours. Not only could the culture not comprehend that one of their heroes wanted to pursue practical immortality, they could not even begin to comprehend that we were not going to bend, either to their will, or to their flawed system of values, and surrender an Alcor-B patient to the mob for destruction. This resulted in vandalism, gunfire into the Alcor-B facility, and the emergence of a small but damaging corps of people committed to the destruction of not only Alcor-B, but of cryonics as a whole. And not just here in the US, but in the whole of the West, as the actions of individuals in the UK and Western Europe demonstrated.

Figure 6: The Alcor-B Patient Care Bay in 2009, shortly before the move to the Mobile, AZ Facility. (Photo by Murray Ballard.)

As late as a year ago, this is how our PCB looked (Figure 6). While precautions had been taken to reinforce the perimeter walls, and to protect the ceiling/roof from assault by the use of blast-resistant polymer matting, there was no control of the facility perimeter, and the degree of protection against an explosive device as simple and easy to fabricate as a large pipe bomb was minimal. Thankfully, the title caption on this slide is now obsolete. We hope that the Cryonics Institute will also act to harden their facilities to protect their patients from assault by the deranged, as well as by the evil and determined.

Figure 7:  A contemporary view of the patient care area of the Cryonics Institute’s facility in Clinton Township, MI

Figure 8: At left above, two of the below ground silos that housed Bigfoot dewars at the facilities of CryoSpan, Inc., in Rancho Cucamonga. CryoSpan is no longer in operation.

We considered many options before deciding upon the Mobile Facility. One option was to protect the patients in the way that CryoSpan, Inc., did in the late 1990s (Figure 8). Mark Connaughton and Paul Wakfer did a brilliant job of implementing a suggestion from Mike Darwin, that the patient dewars be sunk in the ground in steel-reinforced concrete silos. This approach provided an excellent level of protection against many of the existential risks that we’ve just covered (Figure 3). But it falls short of the comprehensive protection we wanted, and of course, it could not address protection of the staff, or the assurance of the infrastructure required to maintain the patients in cryogenic refrigeration.

Figure 9: The relative scales of a standard, immersion LN2 storage Bigfoot dewar, and the ECD-60 dome that now houses Alcor-B’s patients.

After a great deal of searching and evaluation, we decided upon a facility close enough to a major city to put LN2, major emergency medical care, and the other amenities of civilization within easy reach, and yet far enough away to be outside of the “ring of destruction” cities create when they implode from civil unrest, or are the target of thermonuclear attack. That’s how the Mobile site was selected. The decision was then made to use a range of Green Eye Technology modular shelter systems to fabricate the most hardened and subterranean parts of the Facility. To house the patient storage part of our operation, we selected their ECD-60 (Earthcom Dome, 60 ft diameter x 22 ft high). The ECD-60 serves as a central atrium to connect the multiple CAT 12 and CAT 25 shelter modules (Figure 10). In addition to patient storage, the atrium is used to store additional food, houses the communications and defense center, and serves as a meeting, entertainment and exercise area, and so forth. The ECD-60 has its own NBC air supply system with a peak capability of processing 1200 cfm of outside air. As I said previously, we have outfitted that air handling system with a UV sanitizing system. The relative size of the ECD-60 to a Bigfoot patient storage dewar is shown in this slide (Figure 9).
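For anyone who wants to check the air handling figure against the dome’s stated dimensions: treating the ECD-60 as a simple cylinder (an overestimate, since a dome encloses less volume than a cylinder of the same footprint and height), the 1200 cfm peak works out to roughly one full air change per hour:

```python
import math

DIAMETER_FT, HEIGHT_FT = 60.0, 22.0  # ECD-60 dimensions given in the text
PEAK_CFM = 1200.0                    # NBC air handler peak capacity

# Upper-bound volume: model the dome as a cylinder of the same
# footprint and height.
volume_cuft = math.pi * (DIAMETER_FT / 2) ** 2 * HEIGHT_FT
air_changes_per_hour = PEAK_CFM * 60 / volume_cuft

print(round(volume_cuft))              # ~62204 cu ft
print(round(air_changes_per_hour, 2))  # ~1.16 air changes per hour
```

Since the true dome volume is smaller than the cylinder bound, the actual turnover rate is somewhat better than this figure.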

Figure 10: The subterranean complex that comprises the maximally hardened parts of the Alcor-B Mobile, AZ Facility. The ECD-60 Dome serves as the hub from which 11 modules radiate like spokes. There are 6 residential modules, a hospital/dental module, a laboratory module which houses an emergency cryopreservation capability, a diesel fuel storage tank, a diesel powered electricity generating plant and a greenhouse which is used for storage of supplies until the facility becomes active in an emergency. The Patient Care Area is the part of the ECD-60 that adjoins the clinical and technical spokes of the facility. The remainder of the ECD-60 houses the communications and defense center and is used for storage, communal dining and meeting spaces.

This slide shows the complete layout of the subterranean area of the Facility. The costing figures I will be discussing directly include only the ECD-60 Dome, one Cat-12 Residential Module and the Power Generating and Fuel Storage Modules. The other modules of the facility you see laid out here were paid for either by consortiums of Alcor-B members for their own use (via a long term lease-back agreement), or through directed donations. The Catalano Family deserves our tremendous gratitude for providing the funds for the Hospital/Dental and Laboratory modules. These facilities will not only provide medical care for staff and residents, they also allow for any member who needs to be cryopreserved to be perfused and vitrified, even under emergency “hunkered down” conditions.

This configuration, when the greenhouse is operational, will allow for the shelter of ~300 patients (as neuros), as well as 300 residents and staff, for a period of 3 years, without recourse to outside resources. This includes all resources necessary for survival, including food, water, medicines and NBC-processed breathing air. The Patient Care Area, as currently configured, can hold 200 whole body patients and 250 neuropatients. However, in the event of an existential catastrophe of such a magnitude that neither grid power nor liquid nitrogen is available, the whole body patients would, of necessity, be converted to neuro. This could conceivably change if extra generating and liquid nitrogen production capacity were to become available.
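The scale of the 3-year food reserve can be sanity-checked with simple arithmetic. The per-person ration and the energy density of dry staples below are illustrative assumptions, not figures from the reserves program:

```python
RESIDENTS = 300               # residents and staff, from the text
YEARS = 3                     # duration of the reserve, from the text
KCAL_PER_PERSON_DAY = 2000    # assumed planning ration (illustrative)
KCAL_PER_KG_DRY_FOOD = 3600   # assumed energy density of dry staples

total_kcal = RESIDENTS * YEARS * 365 * KCAL_PER_PERSON_DAY
food_tonnes = total_kcal / KCAL_PER_KG_DRY_FOOD / 1000

print(total_kcal)   # 657000000 kcal
print(food_tonnes)  # 182.5 tonnes of dry staples
```

Under these assumptions the program implies storing on the order of 180 tonnes of dry staples, which gives a feel for why the atrium and greenhouse double as storage space.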

 Figure 11: At right above, two men from the construction crew stand atop the ECD-60 shortly before the earth backfill was completed. At left above is the Patient Care Area of the facility as it appears today.

This image (Figure 11) shows the backfill operation underway (at right). The two workers standing atop the dome give some perspective to the scale of the structure and the size of the undertaking. None of these images or plans shows the vehicle access portal (VAP) or its location, for security reasons. The VAP is the way that patient storage equipment and the patients themselves are moved in and out of the facility. Because this structure had to have doors that were blast-resistant to 20 psi of overpressure, a unique design was required and a lot of very clever engineering went into making the VAP convenient to use, as well as hardened against assault.

Figure 12: The CAT-12 houses the Alcor-B staff and also contains a self-contained air handling and power generating facility. The CAT-12 contains enough food and water for a 90-day stay.

The CAT-12, CAT-15 Greenhouse, Power Generating Module, and the Hospital and Laboratory Modules, were all back-filled, covered and compacted before the ECD-60 was. The ECD-60 was the last structure to be buried, because it serves as the central connecting hub – the other modules had to be in place and fully wired and plumbed before ECD-60 could be buried. And since we planned to place a heavy structure atop the Residential Modules 01 and 02 (the Friendly Fortress), it was necessary to achieve a high degree of soil compaction. This took a lot of extra time and a lot of water!

Figure 13: Installation of the CAT-12 Residential Module in the Mobile, AZ facility. The inset photo shows the interior of the CAT-12 as it appears today

The CAT-12 is the smallest housing module in the facility. It was originally selected to house the Alcor-B staff, because it was both adequate, and within our very limited budget at the time.  The CAT-12 can house 10 people in reasonable comfort and it has its own independent power generating capability, galley, 60 day food supply, and air filtration system (Figures 12 & 13). Our original plan for this facility included only the CAT-12 Residential Module. When additional funds became available for an expanded shelter capability to house Alcor-B residents, we decided to leave the CAT-12 configured as originally planned, because this would allow for staff to be emplaced independent of residents (Figure 14). This might be required in situations where the risk was deemed high enough to “button up” the facility and protect the critical personnel needed to care for patients, but where the situation was not deemed urgent enough to shelter the full contingent of the community’s residents.

Figure 14: Specifications and interior layout of the CAT-12 Residential Module.

Of course, what we would really like would be a facility that could autonomously care for Alcor-B’s patients indefinitely. That ideal facility would be nuclear fueled: it would use a super-efficient thermopile (thermocouple) generator to make electricity from the decay heat, and then use the electricity to cool the interior of the dewars to -150C via the Peltier effect (the same thermocouples used in the power generating thermopile, run in “reverse” as thermoelectric cooling elements).

Figure 15: In the event of a prolonged existential crisis the current emergency survival plan calls for conversion of all whole body patients to neuro so that everyone can be cared for within 1 or 2 ITS Bigfoot units.

This would comprise a power generating and refrigerating system with no moving parts, and with a life span equal to the working life of the nuclear fuel. Alas, the Nuclear Regulatory Commission is not going to give us either plutonium or strontium-90, and in any event, thermoelectric cooling technology is not quite good enough yet!
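To see why such a system’s service life would track the fuel, consider the decay arithmetic. Strontium-90’s roughly 28.8-year half-life is the only firm number below; the thermal loading and conversion efficiency are purely illustrative assumptions:

```python
import math

HALF_LIFE_YR = 28.8   # Sr-90 half-life in years
THERMAL_W0 = 5000.0   # assumed initial thermal loading (illustrative)
EFFICIENCY = 0.05     # assumed thermopile conversion efficiency

def electrical_watts(t_years):
    """Electrical output t_years after fueling: exponential decay
    of the heat source times a fixed conversion efficiency."""
    decay = math.exp(-math.log(2) * t_years / HALF_LIFE_YR)
    return THERMAL_W0 * EFFICIENCY * decay

# Output roughly halves every half-life; the plant's useful life is
# set by when output falls below the refrigeration load.
for t in (0, 29, 58):
    print(t, round(electrical_watts(t)))  # 250, 124, 62 watts
```

Whatever the starting figures, the output halves every half-life, so the facility would need to be fueled with a comfortable margin above its steady-state refrigeration load.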

The next best thing was to come up with a system which would run as long as currently available and affordable technology would allow, and that was within our budget of ~$2.5 million (2010 US dollars).

Due to budget constraints, it became immediately apparent that it would not be possible to operate more than two Bigfoot units continuously in an “off-grid” fashion. The solution would be to configure our long-duration emergency capability around the conversion of the whole body patients to neuro. This would allow us to easily store 200+ neuropatients using our existing packaging and packing configurations, as well as our existing cryogenic storage hardware.

The person who should be credited with first coming up with this idea of using a Bigfoot dewar to store the whole patient population, and to refrigerate it independent of the grid, was Dr. Greg Fahy. Back in the late 1980s he proposed using a liquid helium refrigerated “cold finger,” of the type used to pinch-hit for LN2 in electron microscopes and other kinds of laboratory equipment that need ultra-low temperatures but don’t want to be bothered with the logistic headaches associated with LN2. His calculations at that time showed that this was indeed doable with off-the-shelf equipment then commercially available. So, when we began to explore this option, cold fingers were the first thing we looked at. However, we soon discovered that technological advances in small capacity LN2 production plants made the cold finger option no longer attractive.

Figure 16: The CryoMech LNP-40 liquid nitrogen (LN2) plant.

Due to advances in molecular sieving technology, as well as in the design of the Joule-Thomson helium refrigerators used to cool nitrogen (or air) to the point of liquefaction, it was now possible to produce LN2 economically, and in quantity, without the need for a separation tower (Figure 16). In the past, LN2 was made by first liquefying air and then separating out its various chemical components: oxygen, water, carbon dioxide, the noble gases, and so on. This was an inefficient process, one that required a lot of maintenance of the liquefaction equipment and that greatly shortened the time to failure of the LN2 plant.

It is now possible to separate nitrogen from the other gaseous components of air using spiral-wound membranes through which oxygen, carbon dioxide and water vapor permeate preferentially, leaving a nitrogen-rich stream behind – very much as reverse osmosis membranes reject unwanted solutes. This technology eliminates the need for a separation tower and greatly increases the energy efficiency of LN2 production. The absence of water (ice) also prolongs the life of the moving parts in the system, increasing the time to failure (or to preventive maintenance) by about two orders of magnitude over the small capacity plants that were available in the 1980s, when Alcor-B first examined this technology.

 Figure 17: Specifications of the CryoMech LN2 plant.

The platform we selected for use here in the Mobile Facility was the CryoMech, Inc. LNP-40. This small LN2 production plant can generate up to 40 liters of LN2 per day, at a rate of ~1.6 liters per hour. That is enough to keep one of the Intermediate Temperature Storage (ITS) Bigfoot dewars and two of the conventional immersion LN2 storage units in operation.
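As a sanity check on these figures, here is a minimal Python sketch comparing the LNP-40’s stated output with the combined boiloff of the three storage units it must supply. The ~1.6 L/h production rate comes from the talk; the per-unit boiloff rates below are hypothetical, illustrative values, not Alcor-B specifications.

```python
# Sanity check: can one LNP-40 keep three storage units topped up?
rate_l_per_hr = 1.6                   # stated LNP-40 production rate
daily_output = rate_l_per_hr * 24     # 38.4 L/day, consistent with the ~40 L/day rating

boiloff_l_per_day = {
    "ITS Bigfoot": 15.0,         # assumed, illustrative only
    "immersion unit #1": 10.0,   # assumed, illustrative only
    "immersion unit #2": 10.0,   # assumed, illustrative only
}
daily_demand = sum(boiloff_l_per_day.values())

print(f"output {daily_output:.1f} L/day vs demand {daily_demand:.1f} L/day "
      f"(margin {daily_output - daily_demand:+.1f} L/day)")
```

Under these assumed boiloff rates the plant carries the load with only a few liters per day of margin, which is why redundancy (below) matters.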

Figure 18: The elements which comprise the LNP-40 LN2 plant: molecular sieve membranes (upper left), the air compressor which supplies air to the nitrogen sieving membranes (middle right), the LN2 collection and discharge dewar (center) and the Joule-Thomson nitrogen liquefier (lower left).

Necessarily, two of these plants must be purchased at the same time to provide redundancy, both during routine servicing and in the event that one of the plants experiences an irreparable failure during the off-grid interval. Thus, our initial capital outlay for the complete refrigerating system, including two LNP-40 LN2 liquefiers, was ~$144,000. We purchased enough service kits to allow for continuous operation of both systems for 3 years.

Figure 19: Additional specifications of the CryoMech LNP-40.

The LNP-40 discharges LN2 into a specially designed 160 liter storage vessel from which it is subsequently dispensed to the Bigfoot units, as needed. In the near future we plan to have the LN2 from the LNP-40 collected in a larger, super-high efficiency storage vessel. This will maximize our energy efficiency, minimize wear and tear on the liquefier, and allow us to better use the LN2 produced as “energy currency” for refrigeration where and when it is needed elsewhere in the Facility. As you can see (Figure 19), LN2 production does not come cheap in terms of the energy required. While industrial-scale plants are much more efficient, the LNP-40 is still impressively parsimonious when it comes to power consumption, since it uses only 5.5 kW.
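The energy economics can be made concrete with a two-line calculation, using only the figures stated above (a 5.5 kW draw while producing ~1.6 L/h):

```python
# Specific energy of LN2 production implied by the stated figures.
power_kw = 5.5          # stated continuous draw of the LNP-40
rate_l_per_hr = 1.6     # stated production rate

kwh_per_liter = power_kw / rate_l_per_hr   # ~3.44 kWh per liter of LN2
daily_kwh = power_kw * 24                  # 132 kWh/day at continuous operation

print(f"~{kwh_per_liter:.2f} kWh/L; ~{daily_kwh:.0f} kWh/day")
```

Roughly 3.4 kWh per liter: every liter of LN2 in the dewars represents a few kilowatt-hours of solar harvest, which is what motivates the sizing of the power system discussed next.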

And that leads to the next issue we had to address: where were we going to get that kind of power off the grid, reliably, for up to 3 years at a time? Diesel fuel is an option, but the quantities required are not only costly, they also presented major regulatory problems with both Maricopa County and the City of Goodyear. We took a long, hard look at solar and decided it was not only workable, but the superior option. Besides, it seemed only fair, given the merciless baking we endure from the Arizona sun year in and year out, that we be able to convert some of that blistering energy into refrigeration for our patients!

Figure 20: The key elements (and their cost) of Mobile Facility’s solar power system.

We were pleasantly surprised to find that systems were available pretty much as turnkey items in the capacity range we needed, namely 150 kW (Figure 20). We have two discrete solar power generating arrays: one atop the Friendly Fortress (FF, the surface terminator of the facility), and another, much larger one located south of the FF. We have plans to enclose this second array in a more secure fashion, since it is essential for providing sufficient power to operate the LNP-40. The array atop the FF provides enough power for the staff and residents. We also have a large (buried) reserve of stabilized diesel fuel which is sufficient to operate the entire facility continuously for 3 months. Used in conjunction with solar power, this reserve of diesel will allow for uninterrupted operation for approximately 3 years. This assumes the normal number of sunny and cloudy days here in the northeastern Sonoran Desert. All bets are off if there is a nuclear winter, or any other sustained diminution of the anticipated solar radiation.
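Assuming a constant facility load, the stated endurance figures can be cross-checked against each other: if diesel alone lasts 3 months but diesel plus solar lasts ~3 years, then diesel need only carry about one-twelfth of the runtime.

```python
# Consistency check of the stated endurance figures, assuming constant load.
diesel_only_months = 3.0    # stated: diesel reserve alone runs the facility 3 months
combined_months = 36.0      # stated: diesel plus solar runs it ~3 years

diesel_duty_cycle = diesel_only_months / combined_months  # fraction of runtime on diesel
solar_fraction = 1.0 - diesel_duty_cycle

print(f"diesel carries ~{diesel_duty_cycle:.1%} of runtime; "
      f"solar must cover the remaining ~{solar_fraction:.1%}")
```

In other words, the stated plan implicitly assumes solar covers about 92% of operating hours, with diesel bridging nights and cloudy stretches.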

 Figure 21: Supercapacitors of the kind used to store solar-generated electricity in the Alcor-B Facility to meet the “surge demand” of starting motors.

At present, we have 17,442 square feet of solar electric panels. While we rely on lead-acid batteries for bulk energy storage, we also use supercapacitors. Unlike batteries, which store energy in an electrolytic chemical reaction, supercapacitors store energy electrostatically, across the surface of electrodes made of a sophisticated carbon aerogel. Supercaps offer a combination of high power density, high voltage levels, and high charging speeds. This means much more compact and powerful storage that can be quickly recharged. For this marvelous innovation we thank Eugen Leitl, who first suggested their use, and who was of great help in generating the specifications and working with the Canadian manufacturer to ensure that the finished product met our needs. Supercaps have vastly longer life spans than lead-acid (or other kinds of) batteries. But their greatest advantage is their unique ability to meet the large surge requirements of the motors in use throughout the facility, especially the air compressor and Joule-Thomson compressor motors necessary to operate the LNP-40. Supercaps quickly discharge the large amounts of energy required for the “start-up surge” of these motors.
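To illustrate why surge capacity matters, here is a rough, hypothetical sizing exercise. The 5.5 kW LNP-40 load is from the talk; the surge multiplier, start duration, and supercapacitor bank ratings are assumed values chosen for illustration, not Alcor-B specifications.

```python
# Illustrative surge sizing for a supercapacitor bank (all ratings assumed).
run_power_w = 5500.0      # stated LNP-40 running load
surge_multiplier = 6.0    # induction motors typically draw ~5-7x running power at start
start_seconds = 2.0       # assumed start-up duration

surge_energy_j = run_power_w * surge_multiplier * start_seconds   # 66,000 J

# Energy stored in a capacitor bank: E = 0.5 * C * V^2
bank_capacitance_f = 165.0   # assumed (a common rating for a 48 V supercap module)
bank_voltage_v = 48.0        # assumed
stored_energy_j = 0.5 * bank_capacitance_f * bank_voltage_v ** 2  # 190,080 J

print(f"start-up surge ~{surge_energy_j/1e3:.0f} kJ; bank stores ~{stored_energy_j/1e3:.0f} kJ")
```

Even with these conservative assumptions, a single commodity supercap module comfortably covers a start-up surge that would otherwise have to be drawn abruptly from the lead-acid bank.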

Figure 22:  The cost breakdown and key elements of the patient care essential portions of the Alcor-B Mobile, AZ Facility (2010 US dollars).

In concluding, I’d like to briefly review the key elements of this beautiful new facility before we start our tour of it. The entire physical plant, including LN2 production, can be operated from the 150 kW solar arrays and is capable of running for up to 3 years without grid power or supplies from the outside. The total cost to Alcor-B for the patient and staff essential parts of this facility was $2,247,750 in 2010 dollars. The cost for the rest of the facility, exclusive of the above ground residences, was $4,356,925 in 2010 dollars. I believe I can confidently state that this is the most secure, the most technologically advanced and the most beautiful cryonics facility on the planet! It fulfills the dream of our founders, Fred and Linda Chamberlain, and of Mike Darwin and Jerry Leaf – all of whom realized, and worked for, the day when Alcor-B would be as protected from existential and other risks as current technology allows. Today, their dream is our reality.

The End of The Beginning

 

Science Fiction, Double Feature, 2: Part 2 (http://chronopause.com/index.php/2011/08/05/science-fiction-double-feature-2-part-2/ – Fri, 05 Aug 2011, by admin)

Introduction & Tour of the Alcor-B Foundation’s Mobile, Arizona Patient Care Facility & Existential Colony

 Address given to Alcor-B Foundation Cryopreservation Members and Staff

15 September, 2012

By Gorton Carpenter, M.D., Ph.D., President of the Alcor-B Foundation

Alcor-B Cryopreservation Research Foundation (ABCRF)

Figure 1: Alcor is the smaller, dimmer companion to Mizar, the bright star that forms the crook in the handle of the Big Dipper. In the Arab world of the 5th century CE, Mizar’s much fainter (and more difficult to see) companion Alcor was used as a test of good vision: only someone with the clearest and most acute vision could see it. Alcor-B was discovered in 2011 using Project 1640, which makes use of the Hale Telescope’s adaptive optics system. Project 1640 gives the Hale a view almost equal to what is possible in space. The instrument also has the ability to block out the light of a star, allowing faint objects located next to a star to be seen. When the Hale, armed with Project 1640, was pointed at Alcor, it was found that Alcor isn’t a single star: it has a small stellar companion that hadn’t been seen before – Alcor-B, a small, dim red dwarf about one fourth the mass of our Sun. To see Alcor-B you must have the superior vision that only mastery of the most sophisticated technology allows. Alcor-B is thus a test for the clearest and most acute vision – vision capable of seeing things as they really are, not just as they appear to be.

Figure 2: Alcor-B President, Gorton Carpenter, M.D., Ph.D.

Why?

Now that we’ve returned from the break, hopefully refreshed and ready for a bit more technical material, I’d like to continue by explaining why we undertook to create the Mobile, AZ facility. In hindsight it seems strange that those who started cryonics were not more concerned with what could go wrong, let alone with the probability of those things actually going wrong! Looking back at the press clippings and media coverage from that era, it is clear that the public had no problem identifying possible problems with cryonics.

The average Joe focused on power failures, economic calamities, nuclear war, and overpopulation as obstacles, often in simple-minded ways that irritated cryonicists and provoked a knee-jerk dismissal of these objections. The objections were, nevertheless, real and valid. The standard response of cryonicists then, and mostly now, has been: “Cryonics facilities don’t use electricity to refrigerate cryonics patients: they use liquid nitrogen (LN2).” True, our patients aren’t refrigerated by compressors that depend upon a continuous supply of electricity. But they are refrigerated by liquid nitrogen, which is dependent upon the power grid. Any lasting disruption to the power supply, or even rationing of electricity or of the energy commodities from which it is produced, and cryonics patients warm up and rot. The difference from the outcome Average Joe predicted is only 30, or 60, or 90 days, at most.

And how is LN2 made? With electricity, of course.

Cryonicists were also quick to point out (and to take delight in) the inaccurate and foolish predictions of Malthusian doom by “experts” such as Paul Ehrlich. [1] Yet it was in no small degree because of unprecedented and unforeseen technological advances in agriculture, food distribution (computerization) and food preservation technology that that catastrophe was avoided. What those cryonicists did not do was question the wisdom of courting catastrophe, absent at least reasonably good assurance that the solutions would, in fact, be there when they were needed.

Facing Limits & Preparing For Adverse Consequences

And of course, our contempt for this issue left unasked the question: just what are the limits to population growth, even given the most optimistic projections for growth in the food supply? Also not considered were the geopolitical and social issues that attend disproportionate growth in the populations of some peoples while that of others remains unchanged. Nor was any consideration given to the social-demographic effects on populations in which the majority of citizens are in their 20s, unemployed, poorly educated (or not educated at all), impoverished, and facing a future largely devoid of hope. It is also certain that we did not consider the consequences of even a single hiccup in the food supply, such as a sustained and widespread drought that would double or treble food prices in those same already marginally fed populations – something that is happening right now. The result was that, to an alarming extent, we did not foresee the mass famines that now engulf much of Sub-Saharan Africa and significant parts of India. And it is certainly the case that our predecessor cryonicists did not foresee a time when over 16% of Americans would be receiving Food Stamps (Figure 3), and when people using EBT [1] cards at the grocery store till would be something everyone recognized – and that almost no one held in contempt, for fear they might find themselves holding one within a few short months.

Figure 3: In the closing days of 2012, the percentage of Americans receiving Federal Food Stamp assistance reached 16.25% of the population, up from 14.2% in June of 2011.[2] The continued deterioration of the US and global economies, coupled with major agricultural failures in the US as a result of drought and record summer heat throughout most of the nation, are the factors most immediately responsible for this situation.

Of course, this situation did not materialize out of thin air, nor did it happen overnight. We did not suddenly awake to over a quarter of this nation’s children being on Federal food assistance, with more than half of the children in Alabama and Mississippi needing Food Stamps just to stay fed.

Figure 4: The US median household income has been flat or in decline since 1999. Since 2007 the decline has been precipitous and sustained and this is consistent not with an economic recession, but rather with an economic depression.

TRP’s (the Timeline to Rescue Project’s) comprehensive econometric analyses had long shown that this economic upheaval was coming. Median US household income has been either flat or steadily declining since 1999 and, even more alarmingly, debt as a percentage of personal income has been rising steadily and dramatically since at least 1985, while savings as a percentage of personal income have declined even more dramatically over the same period (Figures 5 & 6).

Figure 5: Debt as a percentage of personal disposable (i.e., non-confiscated) income.

Figure 6: Personal savings as a percentage of disposable income from 1985 to 2005.

These trends might have been tolerable over a longer time course had the same kind of fiscal irresponsibility not been practiced on the macroeconomic scale as well. As is evident in this TRP data from 2008, the stock market began to exhibit unequivocally bubble-like behavior in the opening years of the 1990s, and this behavior – an absolutely astronomical (and unprecedented) disconnect between real value and “market value,” as indicated by the S&P Price Index – continued until 2007 (Figure 7). The problem with bubbles, economic and otherwise, is that they burst, and that is precisely what happened to the global economic bubble in 2007.

Figure 7: The incredible disconnect between price, earnings, dividends and probable real value of shares; and of economic wealth as a whole. Data analysis and projection by TRP.

This single piece of econometric data was grounds enough for us to initiate the Mobile, AZ Project. What uncertainty remained was further reduced when we examined the really long-term trend in the purchasing power of the US dollar (Figure 8) and compared it with foreign borrowing and the global debt-to-equity ratio. When the sharp and steady rise in gold prices that TRP had predicted began, we knew we had to act.

Figure 8: Purchasing power of the US dollar from 1900 to 2000.

The decline in the value of the dollar has been so steady and unrelieved that it was impossible for either TRP or Alcor-B management to envision a scenario in which this trend would not only reverse itself, but reverse on a scale sufficient to allow any reasonable possibility of servicing what was then $6 trillion in debt. A quick look at this slide (Figure 9) shows the hopelessness of the situation once the exponential phase of indebtedness and interest is reached – something that clearly happened around 2008.

Figure 9: The US National debt as of 2010, the last year that the TRP econometricians were certain that valid data was being released by the US Treasury Department.

In 2008, these data, taken together, led TRP to recommend that the Patient Care Trust (PCT) increase the fraction of its assets held in gold from 10% to 20%, and that it restructure its real estate and securities holdings. This advice was taken, and what’s more, the PCT decided to commit half the capital accumulated from the 10% Rule to gold: $1.9 million in 2008 dollars. That purchase was made in February of ’08 at a price of $865 per ounce. Half of that “investment” was liquidated on 02 August, 2011, at the peak of the uncertainty over whether Congress would increase the National Debt Ceiling, at a price of $1,648 per ounce. This resulted in a net increase in working capital of $1.85 million 2010 dollars (after transaction-related expenses). In other words, the time and effort invested in TRP covered three-quarters of the cost of the Mobile Facility, including the cost of the land (150 acres × $650/acre). To paraphrase the father of cryonics, Robert Ettinger, in another context: economic depressions are only for the unprepared and the unimaginative. (Laughter)

The Final Fallback Plan

Figure 10: The analogy of cryonics as a bridge to the future, our future, implies that we must both build with enough safety factor to prevent collapse, and anticipate existential crises that pose a genuine threat to its workability.

However, we shouldn’t be even a little smug. The current situation has the potential to deteriorate into a nightmare scenario – one that all the foresight in the world cannot prepare us for, at least not given our small size and even smaller resources. While cryonics is our technological bridge to the future, it is, admittedly, a fragile one. And so one of the things I want to tell you about is the beginning of the implementation of our Final Fallback Position (FFP).

We have pursued cryopreservation as our method of bio-preservation for many reasons. For one thing, it was how the idea was handed to us by Ettinger and Cooper, the two men who first conceived of cryonics. They in turn were persuaded that cryopreservation was the best approach by virtue of the fact that many living systems can survive cooling to cryogenic temperatures if treated properly. And, perhaps just as importantly, both men were, understandably, hostages to the Arrhenius Equation, which makes the rate of a chemical reaction an exponential function of absolute temperature. There’s a caveat, though: the Arrhenius Equation applies only to systems with molecular mobility – in other words, to systems in a liquid or gaseous state. This caveat really didn’t become apparent until the era of vitrification began in cryonics in 2000. At that time, it began to sink in that systems in the solid state pretty much get a pass on the Arrhenius Equation – and it was the Arrhenius Equation that had consigned stable, indefinite biopreservation to the realm of LN2 temperature, or below.
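The force of the Arrhenius argument is easy to quantify. The sketch below computes the ratio of reaction rates between body temperature and LN2 temperature for an assumed activation energy of 50 kJ/mol (a typical value for biochemical reactions; not a figure from the talk). The prefactor A cancels when taking the ratio.

```python
import math

# Arrhenius equation: k(T) = A * exp(-Ea / (R * T))
R = 8.314            # gas constant, J/(mol*K)
Ea = 50_000.0        # assumed activation energy, J/mol (illustrative)
T_body, T_ln2 = 310.0, 77.0   # body temperature and LN2 boiling point, K

# Ratio of rates; the prefactor A divides out.
ratio = math.exp(-Ea / (R * T_body)) / math.exp(-Ea / (R * T_ln2))
print(f"reactions run ~{ratio:.1e}x faster at 310 K than at 77 K")
```

The slowdown is on the order of 10^25 for a liquid-state system – yet the point of the paragraph above is that this calculation does not even apply to a solid: in the solid state molecular mobility itself is gone, which is why ambient-temperature solids like amber can preserve structure for millions of years.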

Figure 11: Culicidae: Diptera in Dominican amber ~40 million years old.[3]

It had been known since the 18th century that biological material trapped in amber is remarkably well preserved, even over time periods of millions of years. However, the quality of that preservation, and its implications for cryonics, were only fully appreciated a decade ago, with the demonstration of well preserved ultrastructure in some organisms preserved in amber. [4] Even more surprisingly, it has now been verified that intact and relatively un-fragmented DNA is also present in some archaeo-amber samples. [5-7] There is also growing evidence that viable microorganisms may be cultured from amber ~20-45 million years old [8] and, incredibly, from aqueous inclusions in salt deposits that are at least 100 million years old. [6, 9-12] These findings led to the initiation of what was initially called the Brain Plastination Project, led by Dr. Ken Hayworth, which has since been rechristened the Ambient Temperature Vitrification Project (ATVP). The goal of the ATVP was to develop minimally biochemically disruptive techniques for rendering biological tissues into the solid state – and hence biochemically quiescent. The idea was to avoid the many chemical and biophysical changes induced by fixation by skipping that step, and introducing plasticizing monomers into tissue that was in a viable, or near-viable, state at the start of the procedure.

Figure 12: Buthidae: Scorpiones in Dominican amber ~25-40 million years old.[3]

Figure 13: Cypress Plant Cell Ultra-structure: Baltic Amber ~45 million years old: http://rspb.royalsocietypublishing.org/content/272/1559/121.full.pdf and http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1634957/pdf/rspb20042939.pdf  [4]

This work is still in its infancy, but we have nevertheless learned a great deal, some of which is directly applicable to the development of the FFP. One of the most difficult problems to overcome in applying this technique to a whole organ the size of a human brain is this: how do you keep the circulatory system accessible, so that the water in the tissue can be replaced with the monomer that will subsequently be polymerized into a solid plastic, and so that the truly enormous amount of heat liberated by the exothermic polymerization reaction can be removed?

Figure 14: A corrosion cast of the circulatory system of the human brain. The extensive vascularization of the brain allows for use of the circulatory system as both a mass and heat exchanger. Gas perfusion of the circulatory system prior to cooling to vitrification temperatures leaves it accessible during cryogenic storage, should fixation and plastination become necessary as a fallback to cryopreservation.

This slide (Figure 14) shows the circulatory system of a human brain. This is the real deal, not a model. What you are looking at is something called a “corrosion cast.” In this case, the arterial circulation of a human brain was injected with a red-tinted plastic material and the brain was then immersed in a strong base, such as a concentrated solution of sodium hydroxide. The base dissolves, or corrodes, the tissue away, leaving behind the red plastic framework of the arterial circulation. It’s easy to see that the human brain is a “strongly circulated” organ – in fact, the brain normally receives about 15-20% of the resting cardiac output, roughly 750 mL of blood per minute. The FFP researchers decided that the best way to achieve both heat and mass exchange was to keep the brain’s circulatory system open and accessible throughout the procedure. In order to achieve this during solidification of the brain, they turned to gas perfusion – replacing the liquid in the circulatory system with gas.

One of the investigators (Mike Darwin) realized that if the circulatory system of human cryonics patients was similarly perfused with gas during cooling to vitrification, not only would cooling be hastened, thus reducing the risk of freezing, but the circulatory system of the patient would remain accessible, even during storage at -150C. What this meant was that it would be theoretically possible to fix and plastinate cryonics patients in the event that cryopreservation was no longer possible.

In this scenario, a patient would be removed from storage to a special apparatus, the Final Fallback Position System (FFPS), where his arterial circulation would be connected to a recirculating system of solvent chilled to -100C. This solvent would be pumped through the patient and would begin dissolving the viscous cryoprotectant-water solution in the patient’s tissues. The solvent would also contain fixative – initially formaldehyde, to fix the proteins, and finally a highly reactive metal compound, osmium tetroxide, which is necessary to fix the lipids that comprise both the cellular and the intracellular membranes. Once the patient had been “solvent substituted” and fixed in this fashion, it would then be possible to safely warm him to room temperature and introduce the monomer required for plastination. In fact, if necessary, this could be done by immersion rather than by perfusion (though this would necessitate removal of the brain from the head).

Figure 15: The head of a fresh (~1.5 hrs post-mortem) human cadaver subjected to vitrification and then deep subzero fixation and plastination with Epon epoxy resin using the FFPS. Accelerated aging tests show both ultrastructural and molecular stability in the range of 1.5 to 2.0 million years.

What you see here (Figure 15) is a human head that has been subjected to this procedure following vitrification. It has taken hundreds of experiments with animals, and over a dozen experiments with human cadavers, to develop this process to the point that it is ready for application to cryonics patients, should the need arise. As a result of a directed donation of $850,000, we have developed the system you see here (Figure 16). This system is capable of processing up to 8 neuropatients at one time. And yes, if you are whole body and the FFP needs to be implemented, you will be processed as a neuropatient, with no ifs, ands or buts. The urgency of such a situation will necessarily strain our capability perilously close to the breaking point, even with our current population of 122 patients.

Figure 16: The Final Fallback Position System (FFPS) for the automated processing of cryopatient cephalons from the deep subzero state, to ambient temperature fixation and solidification (plastination).

I want to be clear that ultrastructural studies conducted during the development of the FFPS do show considerable additional distortion of cellular architecture; the best results are obtained when the procedure is carried out without the intermediate step of vitrification. And yes, we are giving consideration to offering this procedure as an alternative to cryopreservation. However, that is a topic for another time. For now, I felt it was important that you be fully informed not only that we have contingency plans in place for the care of the patients in the event that cryogenic refrigeration becomes unavailable, but also that we now believe such a contingency has sufficient probability to warrant the considerable time and expense involved in developing and deploying this system.

You will be able to see the FFPS during the tour later today. For now, I think we should take our second break before lunch. Following lunch we will assemble here for the formal tour of the facilities.

End of Part 2

Footnotes

[1] EBT cards are Electronic Benefits Transfer cards which have almost exclusively replaced paper food stamps. They are similar to ATM cards but bear the distinctive marking of the state that issues them.

References

1. Ehrlich P: The Population Bomb. Buccaneer Books; 1996.

2. Bloch M, DeParle J, Ericson M, Gebeloff R: Food Stamp Usage Across the Country: http://www.nytimes.com/interactive/2009/11/28/us/20091128-foodstamps.html. New York Times 2011.

3. Poinar G, Poinar R: The Quest for Life in Amber. Reading, MA: Addison-Wesley; 1994.

4. Koller B, Schmitt JM, Tischendorf G: Cellular fine structures and histochemical reactions in the tissue of a cypress twig preserved in Baltic amber: http://rspb.royalsocietypublishing.org/content/272/1559/121.full.pdf. Proc R Soc B 2005, 272:121-126.

5. Schmidt A, Schäfer U: Leptotrichites resinatus New Genus and Species: A Fossil Sheathed Bacterium in Alpine Cretaceous Amber. Journal of Paleontology 2005, 79(1):175-184.

6. Ascaso C, Wierzchos J, Speranza M, Gutiérrez JC, González AM, et al.: Fossil Protists and Fungi in Amber and Rock Substrates: http://digital.csic.es/bitstream/10261/33738/1/DEFINITIVOMicropaleontology_2005.pdf. Micropaleontology 2005, 51(1):59-72.

7. Lambert L, Cox T, Mitchell K, Rossello-Mora RA, Del Cueto C, Dodge DE, Orkand P, Cano RJ: Staphylococcus succinus sp. nov., isolated from Dominican amber. Int J Syst Bacteriol 1998, 48(Pt 2):511-518.

8. Cano R, Borucki MK: Revival and identification of bacterial spores in 25- to 40-million-year-old Dominican amber. Science 1995, 268(5213):1060-1064.

9. Stan-Lotter H, McGenity TJ, et al.: Very similar strains of Halococcus salifodinae are found in geographically separated Permo-Triassic salt deposits. Microbiology 1999, 145:3565-3574.

10. Grant W, Gemmell RT, McGenity TJ: Halobacteria: the evidence for longevity. Extremophiles 1998, 2(3):279-287.

11. Vreeland R, Piselli AF Jr, McDonnough S, Meyers S: Distribution and diversity of halophilic bacteria in a subsurface salt formation. Extremophiles 1998, 2(3):321-331.

12. Brown M: Molecular History Research Center: http://www.mhrc.net/ancientDNA.htm.

 

The Armories of the Latter Day Laputas, Part 6 (http://chronopause.com/index.php/2011/07/12/the-armories-of-the-latter-day-laputas-part-6/ – Tue, 12 Jul 2011, by admin)

Figure 1: Corporations were created by people to be potentially immortal, and yet, on average, they have life spans much shorter than people. Very interestingly, they have about the same maximum life span as people: ~120 years.

By Mike Darwin

On the Importance of the Longevity of Corporations to Cryonics

So what did 6 years and $1.25 million of last year’s money buy for Alcor and cryonics between 1983 and 1989? In the next part of this article, I’ll endeavor to answer that question. I’ve listed most of the milestones Alcor logged during that interval as documented in Cryonics magazine, but I’m sure I’ve missed some. Since I lived those years, and they were for me personally extraordinarily happy and productive ones, I’m simply too close to them to have any pretense of objectivity. However, I think it likely that others will readily supply anything lacking in that regard. If any of you reading this have suggestions for what should appear on that list, please email them to me at m2darwin@aol.com.

Figure 2: Some of the principals who contributed to Trans Time’s dynamism in the 1970s and 1980s. From foreground to background and left to right: Jim Yount, Jerry White, Art Quaife, Judy & Paul Segall, Dick Marsh, John Day, Norm Lewis, Carmen Brewer and Ron Viner. It is sobering to note that Jerry White, Dick Marsh and Paul Segall are now in cryopreservation and Carmen Brewer is deceased.

It is also important to understand that Alcor was, and is, but one organization and one epoch in the history of cryonics. During the 1970s and into the early 1980s, Trans Time, Inc. (TT), under the leadership of Art Quaife, Jim Yount and John Day (operating in the San Francisco Bay Area), reshaped the way cryonics was both perceived and marketed. They also made a number of significant technical advances in engineering and in the mathematics of heat flow and cryoprotectant equilibration. They brought dynamism and renewed hope and energy to cryonics, and were in no small measure responsible for recruiting a number of people who were subsequently essential to Alcor’s success throughout the 1980s and beyond. I would be remiss if I did not note their enormous contribution. What’s more, I believe a thorough analysis of the history of TT would be of considerable importance: TT produced shareholders’ reports with detailed financials, and voluminous minutes of its monthly meetings.

However, that is not data I have access to, and it is not a task I’m well suited to perform. I would, however, note that given the nanoscopic experience base in cryonics compared to the rest of the institutional world, we need to carefully dissect every failure and every success. Why? Because our undertaking is fundamentally different from any that has come before it. Biological evolution proceeds with stunning results in just about the cruelest and least efficient way imaginable.[1] At no point does the process itself, or the organisms that comprise its unfolding, need to stop and consider their predicament, or decide what to do next. As the gutter philosophers say, “Shit happens.” The price of such a blind, unreasoning process is the death and destruction of countless organisms, species, communities and cultures. Tennyson summed it up perfectly in “The Charge of the Light Brigade”: “Theirs not to reason why, theirs but to do and die.”

The modern corporation traces its roots to the 17th century and the emergence of “chartered companies,” such as the Dutch East India Company. It is thus a species that is only ~400 years old. Until the 19th century, almost no attention was paid to why and how businesses came into and went out of existence. It was just something that happened, and it was taken for granted, like old age and death in the biological world. And it was not until the opening of the 20th century that scientific methods were brought to bear to study the fates of business enterprises – for profit or otherwise – and even now, such studies are comparatively few and lack rigor.

This should come as no surprise, because there really isn’t much reason for anyone to care. Enterprises are like rabbits on a farm; as long as the population as a whole is healthy, there will be plenty of them, and the fate of individuals is of no consequence. It is only when epidemic disease, or some other systemic calamity, devastates the hutches that there is concern over mortality. The same is true of epidemiologists; they are concerned with the fate of individual humans only as it impacts population-wide mortality and morbidity. A consequence of this is that we know shockingly little about how to extend the lifespan of corporations. Put another (and far more ominous) way: cryonicists are faced with the task of finding not just a way to indefinitely extend the human lifespan, but also a way to indefinitely extend the lifespan of the corporate entities they propose will care for them and recover them from cryopreservation over a period of many decades, or centuries.

Corporation Gerontology?

The seminal cryonics thinker Thomas Donaldson was preoccupied with examples of institutions that have lasted for centuries. He liked to cite the examples of the King’s Colleges in England and of Westminster Abbey. I remember thinking at the time, “Well, that’s interesting and exciting, but it is also, I think, pretty uncommon. More to the point, how do we make that happen for our organizations?” Thomas and I exchanged letters about this, but I was never able to communicate to him that just because it has happened doesn’t mean it is likely, and it doesn’t mean it will happen for us. Thirty-plus years ago, when we had those discussions, no one had yet generated any statistical data on the longevity of corporations over time.

We now know that the odds of a corporation surviving for 100 years are probably in the range of 1.0 to 1.5%, and of one surviving for 500 years, much, much lower, even if institutions like Oxford and Westminster Abbey are included in the data set. In fact, the average life expectancy for even multinational corporations of Fortune 500 caliber, or its equivalent, is only 40 to 50 years. And what about corporations as a whole? A 2002 study by Ellen de Rooij of the Stratix Group in Amsterdam indicates that the average life expectancy of all firms, regardless of size, measured in Japan and much of Europe, is only 12.5 years. Incredibly, I have been unable to find comparable data for US corporations.
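For illustration, these survival figures can be reduced to a common currency: the constant annual failure rate they imply. This is a toy sketch of my own, not part of the de Rooij study, and the constant-hazard assumption is a deliberate simplification:

```python
def annual_failure_rate(survival_prob: float, years: float) -> float:
    """Constant annual failure rate implied by surviving `years` with probability `survival_prob`."""
    return 1.0 - survival_prob ** (1.0 / years)

# ~1.5% of corporations make it to 100 years:
print(annual_failure_rate(0.015, 100))  # ≈ 0.041, i.e. roughly a 4% chance of dying in any given year

# A mean lifespan of 12.5 years under a constant hazard implies an even
# harsher annual failure rate of 1/12.5:
print(1 / 12.5)  # 0.08
```

Seen this way, the century-scale survivors and the 12.5-year average are not wildly inconsistent: both imply that in any given year a firm faces a few-percent chance of death, and compounding does the rest.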

Figure 3: The author standing next to the “John Snow Pump” on Broadwick Street, Soho, London in May of 2011. Snow is justly considered the father of epidemiology for his work in pinpointing the source of the 1854 cholera outbreak in London as the public water pump on Broad Street (now Broadwick Street). The municipal authorities removed the handle from the pump to prevent the local residents from using the water. The handle remains off of the pump except for one day of the year, John Snow Day, when it is ceremoniously put back in place.

Figure 4: The Living Company: Habits for Survival in a Turbulent Business Environment by Arie De Geus is one of the first books to examine why corporations have the life spans they do. As such, it is a seminal work much deserving of cryonicists’ attention.

The study of corporate hygiene and pathology seems to be where medicine was in the 17th century. There is a great deal of cupping, blistering, bleeding and amputation – mostly to no good effect – and mostly carried out by incompetents (e.g., politicians, governments and nation-states). The concept of the “public health of corporations” is still nascent, and the equivalent of the “John Snow moment”[2] of discovering how to halt the spread of business-killing epidemics, such as the one we are suffering right now, seems still in the future. The idea of a discipline of corporate medicine whose job it is to study the corporate aging process and extend corporate life span has apparently only just occurred to economists and business analysts.[1, 2] Like so much else in cryonics, no one else has the slightest clue or the slightest incentive to systematically study this problem and come up with solutions. It is a brutal fact of our time and place in history that the need to understand the processes attending corporate morbidity and mortality has simply not (yet) become an issue for human civilization.[3]

Figure 5: Trajectories of projectiles launched at different elevation angles but the same speed of 10 m/s, in a vacuum with a uniform downward gravity field of 10 m/s². Points are at 0.05 s intervals, and the length of their tails is linearly proportional to their speed. t = time from launch, T = time of flight, R = range and H = highest point of trajectory (indicated with arrows). Corporations have similar arcs, from launch to crash back to earth.
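The caption’s quantities follow from elementary kinematics. A quick sketch using the figure’s stated values (launch speed 10 m/s, g = 10 m/s²); the application to corporations is of course only an analogy:

```python
import math

def trajectory(v: float, theta_deg: float, g: float = 10.0):
    """Time of flight T, range R and peak height H for a drag-free trajectory."""
    th = math.radians(theta_deg)
    T = 2 * v * math.sin(th) / g            # time until return to launch height
    R = v ** 2 * math.sin(2 * th) / g       # horizontal range
    H = (v * math.sin(th)) ** 2 / (2 * g)   # highest point of the arc
    return T, R, H

T, R, H = trajectory(10.0, 45.0)
print(f"T = {T:.2f} s, R = {R:.1f} m, H = {H:.1f} m")  # T = 1.41 s, R = 10.0 m, H = 2.5 m
```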

Clearly, some corporations remain fantastically innovative and productive over time while most do not, and there is evidence that such innovative firms survive the longest. Two notable examples of the former are 3M (Minnesota Mining and Manufacturing) and Apple Computer. For a contrast with 3M, pick just about any has-been industrial giant of the past century. For a contrast with Apple there is Microsoft. While Microsoft is unarguably richer and larger, and no doubt most of those laboring there feel very secure, it is neither an exciting place to work, nor a particularly creative one. A careful analysis of these two examples of corporate robustness is beyond the scope of this series of articles, and most probably beyond the range of this author’s abilities. For now, it is sufficient to point out that these companies have interesting histories which may have value to cryonics. They also highlight the fact that most enterprises experience an arc, akin to that of a ballistic trajectory. In his seminal book The Living Company: Habits for Survival in a Turbulent Business Environment, Arie De Geus, the former head of Royal Dutch Shell’s Strategic Planning Group, maps out a life span arc for corporations (Figure 7) and notes that currently corporations “exist at a primitive stage of evolution; they develop and exploit only a fraction of their potential.”

De Geus may well be one of the first people to carefully consider why corporations have such appallingly short life spans when their very raison d’être was their potential for immortality. I believe De Geus’s work is important for cryonicists to pay attention to, and I am going to quote him at length here on his observations regarding the characteristics of long-lived corporations:

 Figure 6: Arie De Geus

“After all of our detective work, we found four key factors in common:

1. Long-lived companies were sensitive to their environment. Whether they had built their fortunes on knowledge (such as DuPont’s technological innovations) or on natural resources (such as the Hudson’s Bay Company’s access to the furs of Canadian forests), they remained in harmony with the world around them. As wars, depressions, technologies, and political changes surged and ebbed around them, they always seemed to excel at keeping their feelers out, tuned to whatever was going on around them. They did this, it seemed, despite the fact that in the past there were little data available, let alone the communications facilities to give them a global view of the business environment. They sometimes had to rely for information on packets carried over vast distances by portage and ship. Moreover, societal considerations were rarely given prominence in the deliberations of company boards. Yet they managed to react in timely fashion to the conditions of society around them.

Figure 7: The arc of the corporate life span as proposed by Arie De Geus. Note that the terminal phase is bureaucracy, wherein the corporation becomes unresponsive to its environment and increasingly insulated from both new opportunities and its customers by bureaucratic mechanisms.

2. Long-lived companies were cohesive, with a strong sense of identity. No matter how widely diversified they were, their employees (and even their suppliers, at times) felt they were all part of one entity. One company, Unilever, saw itself as a fleet of ships, each ship independent, yet the whole fleet stronger than the sum of its parts. This sense of belonging to an organization and being able to identify with its achievements can easily be dismissed as a “soft” or abstract feature of change. But case histories repeatedly showed that strong employee links were essential for survival amid change. This cohesion around the idea of “community” meant that managers were typically chosen for advancement from within; they succeeded through the generational flow of members and considered themselves stewards of the longstanding enterprise. Each management generation was only a link in a long chain. Except during conditions of crisis, the management’s top priority and concern was the health of the institution as a whole.

3. Long-lived companies were tolerant. At first, when we wrote our Shell report, we called this point “decentralization.” Long-lived companies, as we pointed out, generally avoided exercising any centralized control over attempts to diversify the company. Later, when I considered our research again, I realized that seventeenth-, eighteenth-, and nineteenth-century managers would never have used the word decentralized; it was a twentieth-century invention. In what terms, then, would they have thought about their own company policies? As I studied the histories, I kept returning to the idea of “tolerance.” These companies were particularly tolerant of activities on the margin: outliers, experiments, and eccentricities within the boundaries of the cohesive firm, which kept stretching their understanding of possibilities.

4. Long-lived companies were conservative in financing. They were frugal and did not risk their capital gratuitously. They understood the meaning of money in an old-fashioned way; they knew the usefulness of having spare cash in the kitty. Having money in hand gave them flexibility and independence of action. They could pursue options that their competitors could not. They could grasp opportunities without first having to convince third-party financiers of their attractiveness.

It did not take us long to notice the factors that did not appear on the list. The ability to return investment to shareholders seemed to have nothing to do with longevity. The profitability of a company was a symptom of corporate health, but not a predictor or determinant of corporate health. Certainly, a manager in a long-lived company needed all the accounting figures that he or she could lay hands on. But those companies seemed to recognize that figures, even when accurate, describe the past. They do not indicate the underlying conditions that will lead to deteriorating health in the future. The financial reports at General Motors, Philips Electronics, and IBM during the mid-1970s gave no clue of the trouble that lay in store for those companies within a decade. Once the problems cropped up on the balance sheet, it was too late to prevent the trouble.

Nor did longevity seem to have anything to do with a company’s material assets, its particular industry or product line, or its country of origin. Indeed, the 40- to 50-year life expectancy seems to be equally valid in countries as wide apart as the United States, Europe, and Japan, and in industries ranging from manufacturing to retailing to financial services to agriculture to energy.

At the time, we chose not to make the Shell study available to the general public, and it still remains unpublished today. The reasons had to do with the lack of scientific reliability for our conclusions. Our sample of 30 companies was too small. Our documentation was not always complete. And, as the management thinker Russell Ackoff once pointed out to me, our four key factors represented a statistical correlation; our results should therefore be treated with suspicion. Finally, as the authors of the study noted in their introduction, “Analysis, so far completed, raises considerable doubts about whether it is realistic to expect business history to give much guidance for business futures, given the extent of business environmental changes which have occurred during the present century.”

Nonetheless, our conclusions have recently received corroboration from a source with a great deal of academic respectability. Between 1988 and 1994, Stanford University professors James Collins and Jerry Porras asked 700 chief executives of U.S. companies – large and small, private and public, industrial and service – to name the firms they most admired. From the responses, they culled a list of 18 “visionary” companies. They didn’t set out to find long-lived companies, but, as it happened, most of the firms that the CEOs chose had existed for 60 years or longer. (The only exceptions were Sony and Wal-Mart.) Collins and Porras paired these companies up with key competitors (Ford with General Motors, Procter & Gamble with Colgate, Motorola with Zenith) and began to look at the differences. The visionary companies put a lower priority on maximizing shareholder wealth or profits. Just as we had discovered, Collins and Porras found that their most-admired companies combined sensitivity to their environment with a strong sense of identity: “Visionary companies display a powerful drive for progress that enables them to change and adapt without compromising their cherished core ideals.” [3]

Who are we Kidding?

Of course, as De Geus himself points out, these observations are just that – observations – they lack scientific rigor and they point up just how nascent an endeavor the study of corporate longevity is. There is also the fact that all of these studies are of for-profit corporations, and will likely continue to be, because that’s where the money is. Religious institutions and nation-states are already certain that they are immortal, so it seems unlikely there will be much study done in those areas of corporate health and longevity.

So, let us pause here and consider our predicament. For going on 50 years, cryonicists have been trying to sell, promote, foster and even give away cryonics, with very little success. What’s more, we are genuinely astonished when people look at us as if we are credulous fools. We can’t understand why they don’t “get it.” Can’t they see the dire fix they are in, and thus appreciate that we’re the only game in town?

Regrettably, that statement of the situation is a straw man; it is not necessarily a binary situation wherein, if you opt for cryonics, you may live again, and if you don’t, you will certainly die. A good hard look at the data suggests that it is perfectly reasonable for people to believe that you can opt for cryonics, and that it may be technically possible to achieve reanimation, but that you will still end up dead. Most people don’t need to run the numbers on a spreadsheet to understand this, because they are arguably more in touch with the reality of just how fragile the secular world is than are cryonicists. Indeed, the only institutions in common experience that endure for more than a century – or even just for a century – are religions, nation-states and fraternal organizations, and the odds aren’t very good (and the life spans aren’t very long) even for most of those institutions. England, as a continuously functioning nation-state, only goes back to the Restoration after Cromwell in 1660, a mere 351 years ago. The US has an even shorter lifespan of 235 years.

Figure 8: Offering someone a costly ticket on a plane that has a negligible probability of making the journey without falling out of the sky is not much of an alternative to staying put, even in the face of certain death. At least you have the opportunity to enjoy the money you would have spent on a ticket to nowhere.

So, just who are we kidding? By way of analogy, it may be perfectly possible that a much better life in California awaits an unhappy man in the slums of Haiti today. However, he can understandably be excused if he fails to lunge at the opportunity to make the trip in an aircraft that has a 0.00000001 chance of successfully making the journey. The Cryonics Calculator is a useful tool for objectifying our assumptions about risk, but it isn’t the only such tool. The fact is that most people run that calculation at least once in their life (when they first hear of cryonics), and some run it a second time, when they find out they are dying. If cryonics doesn’t pass the credibility sniff test, then it simply does not exist as a reality for most people. They think we are as crazy as we think they are – we for buying into cryonics, and they for buying into religion. But, and you have to give them this: leaving the workability of the product aside, they still have us beat when it comes to demonstrating even the barest possibility of institutional longevity much beyond a century – or three or four at most.

Think about that, and consider very carefully what we have done to demonstrate that the craft we propose to fly us across the decades, or if need be the centuries, possesses any credible degree of airworthiness.

Footnotes


[1] Or so it seems to us, because we can reason and plan. Evolution is not a conscious process that can design prospectively. If that “defect” in its algorithm of progress is understood, then it is in fact remarkably efficient.

[2] Snow was a skeptic of the then-dominant miasma theory, which held that diseases such as cholera or the Black Death were caused by pollution or a noxious form of “bad air.” The germ theory of disease did not come on the scene until 1861, when it was established by Pasteur. As a consequence, Snow was unaware of the mechanism by which cholera was transmitted, but the evidence led him to believe that it was not due to breathing foul air. He first publicized his theory in an essay, “On the Mode of Communication of Cholera,” in 1849. By interviewing Soho residents, with help from the Reverend Henry Whitehead, he identified the source of the outbreak as the public water pump on Broad Street (now Broadwick Street). Although Snow’s chemical and microscopic examination of a sample of the Broad Street pump water was not able to conclusively prove its danger, his studies of the pattern of the disease were convincing enough to persuade the local council to disable the well pump by removing its handle.

[3] By contrast, a great deal of largely unscientific effort has been focused on the “biology” of nation-states and empires. Arnold Toynbee’s blighted 12-volume A Study of History is a classic case in point (A Study of History: Abridgement of Vols I–VI, with a preface by Toynbee, Oxford University Press, 1946).

References

1. De Geus A: The Living Company: Habits for Survival in a Turbulent Business Environment. Harvard Business Press; 2002.

2. Sheth J: The Self-Destructive Habits of Good Companies: …And How to Break Them. Wharton School Publishing; 2007.

3. De Geus A: The Lifespan of a Company. Bloomberg Businessweek; 2002. http://www.businessweek.com/chapter/degeus.htm

]]>
http://chronopause.com/index.php/2011/07/12/the-armories-of-the-latter-day-laputas-part-6/feed/ 5
Future Babble: A Review and Commentary http://chronopause.com/index.php/2011/06/29/future-babble-a-review-and-commentary/ http://chronopause.com/index.php/2011/06/29/future-babble-a-review-and-commentary/#comments Thu, 30 Jun 2011 05:37:07 +0000 admin http://chronopause.com/?p=785 Continue reading ]]>  

  • McClelland & Stewart (October 12, 2010)
  • ISBN-10: 0771035195

Book Review and Commentary by Mike Darwin

The success of cryonics, both in absolute and relative terms, arguably depends upon the accuracy and precision with which we (cryonicists) can predict the future. Our ability as seers is important in the absolute sense because failure to accurately anticipate the requisite social, economic and scientific developments necessary for the success of cryonics would mean that we are wasting our time, energy and money – and perhaps should concentrate those assets on other strategies for survival (or, more simply, stop tilting at windmills and enjoy our lives in the here and now). Our predictive ability is also important to cryonics’ success relatively, since failure to accurately foresee the short- to intermediate-term future of cryonics is very likely to erode our credibility with both the general public and the professional and scientific communities, and to result in failure to anticipate lethal problems that might otherwise have been avoided.

If you doubt that this is so, there is a simple on-line “game” you can “play” that was developed by cryonicist and computer programmer Brook Norton. It is called The Cryonics Calculator: Derivation of Cryonics Probabilities, and it allows you to enter the risk of various possible failure modes for your hypothetical (or real) cryonics organization and then see what happens to the probability that you will remain cryopreserved long enough to be revived: http://www.cryonicscalculator.com/index.php?option=com_content&view=article&id=2&Itemid=3. The results might be described as the reverse of compound interest: small risks over any short period of time become lethal risks over long periods of time. In plugging scenarios into the Cryonics Calculator, I was also reminded of the liability to failure of complex systems with hundreds or thousands of critical components, even if the per-component reliability is 99%. Spacecraft, as any Shuttle engineer will tell you, are a good example of this phenomenon.
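Both effects, compounding risk over time and multiplying component reliabilities, reduce to the same arithmetic. A minimal sketch of my own (not the Cryonics Calculator’s code; the 2% and 400-component figures are illustrative):

```python
def survival_probability(annual_risk: float, years: int) -> float:
    """Chance of surviving `years` consecutive years, each with `annual_risk` of failure."""
    return (1.0 - annual_risk) ** years

def system_reliability(per_component: float, n_components: int) -> float:
    """Chance that every one of `n_components` independent critical parts works."""
    return per_component ** n_components

# A "small" 2% annual risk leaves only about a 13% chance of remaining
# cryopreserved after a century:
print(survival_probability(0.02, 100))  # ≈ 0.133

# 400 critical components, each 99% reliable: the system works < 2% of the time.
print(system_reliability(0.99, 400))    # ≈ 0.018
```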

So, how do we do in predicting the future? That question isn’t hard to answer in the case of most cryonicists, because there is a fairly large base of written material available to peruse in making an assessment. The answer is that we do horribly. Really horribly.

Of course, cryonicists are by no means the only people interested in predicting the future. To some extent, everybody wants to know what tomorrow holds. Economists, politicians, investors, corporations – in fact, just about every human institution and enterprise – have a strong incentive to accurately predict what lies ahead. Indeed, many people make their livings doing just that; stock market analysts, commodities advisers, government intelligence analysts, and even the neighborhood fortune teller are all paid to peer into the future and tell us what lies in store. In answer to the question of how well these more conventional (and vastly more respected) seers perform, Canadian journalist Dan Gardner wrote the book Future Babble: Why Expert Predictions Fail and Why We Believe Them Anyway. Gardner’s conclusion, informed heavily by the research of Philip Tetlock, Professor of Psychology at the University of Pennsylvania, is that the experts, be they economists, petroleum experts, futurists, or political pundits, are about as accurate in forecasting the future as a group of “dart-throwing monkeys.”

In fact, on average, you’d be better off making decisions about what is to come based on a simple coin toss, or on deciding that “things will stay about the same.” The first question that comes to mind is: why are the experts (and indeed humans in general) so bad at predicting the future? Gardner explores the answers to this question in clear, easy to understand terms, in text that is as concise as it is fast paced. At the most basic level, predicting the future suffers from the problems of complexity and chaos that are inherent in the real world. Want to know when “peak oil” production will occur? How hard can that be to figure out? There is clearly a finite amount of oil on the planet, it would seem we know how much is left, and it is certainly easy enough to plug in various numbers for the rate at which oil is being consumed. What’s so difficult about that?

As it turns out, even such a seemingly simple problem is enormously complex. Knowing where and how much untapped oil exists is more difficult than it seems. Technological advances can not only make formerly unreachable oil accessible, they can also make long-abandoned oil fields formerly considered “exhausted” highly productive. And, as prices rise, previously economically nonviable sources of oil, such as oil sands, become cost effective to recover. While there is no question that oil will eventually run out, there is a huge difference between that happening in the 1980s and it still not having happened 20 years later. Accuracy isn’t enough; precision is critically important as well.

If complexity weren’t a bad enough problem, to it can be added the problem of chaos, as in chaos theory. Modern chaos theory originated with the work of mathematician and meteorologist Edward N. Lorenz, who noticed that even infinitesimal changes to the numbers used in mathematical models of weather prediction resulted in radically altered outcomes. It was Lorenz who discredited linear statistical models in meteorology and who famously asked, “Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?” The answer is yes, it can, and thus was born the term “the butterfly effect.” Chaos powerfully limits both accuracy and precision in predicting the behavior of complex systems, of which the everyday world is certainly one.

A central point that Gardner considers is Tetlock’s study (and resulting book) Expert Political Judgment: How Good Is It? How Can We Know? (2005), which describes his 20-year-long prospective study in which 284 experts in many fields, from university professors to journalists, and with many ideological orientations, from ultra-left Marxists to libertarian free-marketeers, were asked to make 28,000 predictions about the future. Tetlock found their performance dismal: they were only slightly more accurate than chance. His study was complex, but his conclusion was brutally simple: the experts were not only worse than run-of-the-mill statistical models, they could barely eke out a tie with the proverbial dart-throwing chimps. And ideology made no difference: capitalists and Marxists performed equally poorly.

None of this should be too surprising. Lots of other authors have explored this phenomenon in detail, most notably Tetlock himself (i.e., Expert Political Judgment), and Nassim Taleb, in his superb book Fooled by Randomness (and later in The Black Swan). The useful things about Gardner’s book are that it presents these ideas in a highly readable and accessible format, and that it explores the underlying psychology and biology of why we humans are such “seer-suckers.” We just can’t help coming back for more – usually from the same “discredited” experts who misled us only a few years, months or even weeks before.

Implications for Cryonics

Recently, in preparation for another piece of writing, I hauled out my copy of science fiction author Robert Heinlein’s 1980 book, Expanded Universe. Included in the book are his essays “1950 Where To?” and “The Third Millennium Opens.” The former contains his predictions about the year 2000, made in 1950, and the latter his predictions about the year 2001, made from the vantage point of 1980. In reading these, it is impossible to conclude anything other than that Heinlein was terrible, in fact ridiculously terrible, at predicting the future. “Where To?” is 7 pages long, whereas his attempt to justify and waffle on the failed predictions he makes there runs to (a pathetic) 29 pages! Heinlein was neither stupid nor ignorant; he had access to some of the best scientific, technical and military minds of his day (as did future forecasters Herman Kahn and Robert Prehoda), and yet he failed utterly to see what lay even 20 years ahead of him, as did virtually all of the other technological seers before him.

What does this mean for cryonics? At first glance the news would seem to be all bad. It is pretty clear that we can’t predict the future, even the very near-term future (5-10 years), either in terms of technological advances or man-made or natural catastrophes. The future remains as it has always been; not just to be seen “through a glass darkly,” but not to be seen at all. However, there is some more hopeful news summarized in Gardner’s book (and presented in considerably greater detail in Tetlock’s superb Expert Political Judgment), which I believe has real and useful application to cryonics. Not all seers in Tetlock’s study were equally bad. Some were truly terrible, and those were invariably the experts who based their decision making on an ideological agenda. It did not matter whether the experts were Marxists or capitalists; to the extent their decision making was ideologically based, it was invariably less accurate. The best decision makers relied on multiple sources of data, entered the problem-solving process with minimal biases, and had little or no ego investment in their conclusions. In other words, they were willing to revise their thinking, admit errors and reevaluate their conclusions as necessary. That’s a fairly uncommon trait in humans, even amongst scientists.

The Directors, Officers and in particular the Chief Executive Officers of cryonics organizations are the ones on whom the proximate responsibility rests for shepherding the organization’s members and patients into the future. In the past, no attention has been given to how these people should be selected. In large measure this has been because the pool of candidates has been vanishingly small, and all too often almost anyone willing to serve had to be accepted, for lack of any alternative. Hopefully, the future will offer more choice, and if and when it does, it would behoove us to carefully examine the background and the corpus of writing of those whom we choose to lead us. We should look for the accuracy and precision of their past decision making, as well as for the extent to which they are “calibrated” in their decision making. If a person says (on average) that he is ~80% confident his predictions will come true, and in fact ~80% of them do prove correct, then he is perfectly calibrated. This is important, because knowing how much confidence to place in your judgment is often crucial. Overconfidence can be a killer, as can endless waffling and the inability to act.
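Calibration in this sense is easy to measure, given a forecaster’s track record. A minimal sketch; the function and the data are illustrative, not drawn from any real study:

```python
def calibration(forecasts):
    """Given (stated_confidence, came_true) pairs, return the forecaster's
    average stated confidence and the observed hit rate. A well-calibrated
    forecaster has the two numbers roughly equal."""
    avg_confidence = sum(conf for conf, _ in forecasts) / len(forecasts)
    hit_rate = sum(1 for _, hit in forecasts if hit) / len(forecasts)
    return avg_confidence, hit_rate

# Ten predictions made at 80% confidence, of which eight came true:
conf, hits = calibration([(0.8, True)] * 8 + [(0.8, False)] * 2)
print(conf, hits)  # both ≈ 0.8: well calibrated
```

An overconfident forecaster shows an average confidence well above the hit rate; the reverse pattern marks the chronic waffler.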

Beyond the leader as seer there are, of course, many other duties and qualities required. These are beyond the scope of consideration here. However, a good place to start would seem to be that we not empower people to decide our futures who are demonstrably terrible at predicting it. Not just ‘flip of the coin’ bad, but truly, terribly bad. Such people, it turns out, are fairly easy to spot by examining the corpus of their past work and decision making. This is quite different from looking at “markers,” such as economic success. A used car salesman, a stock broker, or a huckster of commemorative coins may be tremendously financially successful. The question that should be asked in such cases is, “At whose expense?”

____________________________________________________________

Afterword

A few months ago, I was scanning (digitizing) some back issues of Cryonics magazine from 1988, and I happened to notice I had written (with assistance from Steve Harris, M.D.) an article predicting the future of medicine 20 years hence, entitled “The Future of Medicine” (Cryonics, January 1988, pp. 31-40: http://www.alcor.org/cryonics/cryonics8802.txt and Cryonics, February 1988, pp. 10-20: http://www.alcor.org/cryonics/cryonics8803.txt). I had forgotten I’d even written the article! You can read it and see how well (or poorly) I did.

That article led me to more comprehensively review my writings over the years. The results were interesting. For those of you who write, publicly or privately, I can promise you that rereading your writings in the decades to come will be a fascinating undertaking. Socrates famously said, “The unexamined life is not worth living.” Well, maybe, but I think that just perhaps, the unexamined life may be a lot more fun.

The Armories of the Latter Day Laputas, Part 3
http://chronopause.com/index.php/2011/06/21/the-armories-of-the-latter-day-laputas-part-3/
Wed, 22 Jun 2011 06:25:14 +0000

By Mike Darwin

“When reason fails, the devil helps!”

— Fyodor Dostoyevsky, Crime and Punishment

The Entropy of Empire

There are, no doubt, many reasons why men aspire to become the chief executive officers (CEOs) of nation-states turned empires, not the least of which is a sincere desire to directly affect the course of these empires’ decision making and thus, history. Rarely is this opportunity granted, because nation-states, and especially imperial nation-states, are driven by an overweening self-interest that is nearly perfectly inscrutable. Thus, the course of empires is governed not so much by the conscious decisions of individual men as it is by the inevitable collapse of empires; what I call the “entropy of empire.”

The entropy of empire narrows and constrains the choices any individual actor can make, and does so most powerfully with respect to the actions of the CEO. As a result, the most powerful influence an imperial CEO or emperor is likely to have will occur not as a consequence of the formal or deliberate decisions he makes, but rather as a consequence of his unintended actions.

 

Figure 26: As empires enter the arc of decline, their CEOs or emperors become increasingly constrained in their decision making. High stakes, risk aversion, absence of normal feedback (e.g., life in the bubble) and increasing enmity from the larger community all act to reduce the options a leader can take without catastrophically destabilizing the system, or alienating special interests that are perceived as critical to maintaining the status quo.

If you doubt the former, it is only necessary to look to the economic and international policies of the ostensibly liberal Democratic US President, Barack Obama, taken since he assumed the Presidency in 2009, and contrast them with those of George W. Bush, his “ultra-conservative” Republican predecessor. Because the entropy of empire is in play, it is hard to tell where the foreign or economic policy of Bush left off and that of Obama began. And so the irony is that it is most often not the ideologically informed policy decisions that mark an imperial CEO out as having affected the course of history, but rather his ancillary, collateral, or wholly unintended actions that influence the turn of events; often none more so than his personal style and temperament. It may be that one (small) reason for the vile arbitrariness and barbarity of emperors such as Nero, Caligula, Stalin or Hitler is the sad realization that the only way they can truly make history is by behaving badly.

There is an unfortunate tendency to think of empires solely in geopolitical terms; as agglomerations of nation-states and territories spanning continents and being possessed of vast wealth and power. In fact, empires come in all sizes, and while all are, relatively speaking, both wealthy and powerful (and ultimately profligate and failed), they can exist whenever conditions allow for the dominance and control of an asset deemed essential by some fraction of the population. The Roman Catholic Church prior to the Reformation and the company-owned Appalachian mining town are both examples of empires that can exist apart from the nation-state (and even within it), writ both very large and very small. It is a peculiarity of cryonics under current conditions that, because of its lack of widespread societal acceptance, the absence of meaningful qualitative feedback, and the high threshold of resources and credibility required to capture any of the current microscopic market, new ventures are effectively prohibited, at least within the US. Thus are empires made, and once made, they go on until their time is up; until their entropy collapses them.

The result is that the two extant cryonics organizations can and do operate in market niches which largely remove them from competition with each other and which allow for the creation of nano-empires which span the globe – empires that are, as are all empires, top-down structured, and which operate on the basis of internally developed plans and goals which are created and executed largely independent of actual market forces and core scientific challenges. In this respect, they bear a striking resemblance in their mode of operation to the planned economies of communist nation-states.

Lessons From the Cold War Arms Race?

Figure 27: US President John Fitzgerald Kennedy (JFK).

John Fitzgerald Kennedy (JFK) stood for the office of President of the United States on 02 January, 1960. At that time he was most widely known as the war hero turned congressman who had authored the Pulitzer Prize-winning book Profiles in Courage, rather than as the author of the Harvard senior thesis turned best seller[1] Why England Slept.[2] In fact, Profiles in Courage was ghostwritten,[3] and despite its Pulitzer Prize, it is a mediocre book bordering on bad. If you want a glimpse into the kind of mind that JFK possessed, then Why England Slept is the essential read. It is also a haunting read, because Kennedy’s thesis was that the British Empire was already in terminal decline by the early 1930s, and that Britain was in neither the military nor the economic condition to have opposed Hitler at Munich in 1938. Kennedy’s argument was simple, and in hindsight unassailable: confront the Nazis in 1938 and rapidly lose the war. Appease them and buy time until the USSR and the US were brought into the conflict, or induced to materially assist Britain, and the war might well be won. When Kennedy finished his thesis in 1940, the verdict on victory was still out, and it would not be confirmed by history until five bloody years later.

Anyone who takes or accepts the credit for the intellectual work product of another man, as JFK did when he was complicit in the creation of Profiles in Courage, has shown himself to be a blackguard. It is therefore tempting to look at Why England Slept as merely an apologist tract for the pro-fascist politics of his father, Joseph P. Kennedy, and of the pro-fascist, isolationist demographic in America and England in which he had been reared.[4] In the run-up to the war, the US had been deeply divided in its sympathies. While there was a long tradition of Anglophilia stemming from the country’s roots as a British colony, there was also a large German and Irish population which regarded Great Britain with contempt, suspicion, or both.

The economic collapse resulting from the stock market crash in 1929 had led to profound social unrest and the emergence of vigorous socialist and communist political activism throughout the Western world. In Germany, Spain and Italy the advance of communism was seen to have been effectively stopped by Hitler, Franco and Mussolini, and perhaps even more importantly, to have been countered by a movement (Fascism) which offered economic recovery as well as eugenic improvement and a new “scientific system of government.” Mussolini made the Italian trains run on time, and Hitler created a vast industrial infrastructure in Germany and pulled the country out of a catastrophic inflationary depression. Both men coupled these accomplishments with a showy ideology that sparkled with glamor and promised to bring order out of chaos. To many, Fascism offered the prospect of lasting peace and prosperity.

While JFK was reared in this ideological milieu, Why England Slept reveals an original thinker who was not just curious about geopolitical history, but able to draw significant and valid conclusions from surprisingly meager data. This ability may well have been the very thing that redeemed him from what, during his campaign for the Presidency, must have seemed a minor indiscretion with the facts, and but one of many. At the time of JFK’s presidential campaign, elements inside the US Air Force were engaged in a major disinformation campaign, principally to convince as many in government as possible, as well as the American electorate, that there existed a “missile gap” between the USSR and the US. Following the launch of Sputnik in 1957, Air Force General Curtis LeMay and the corps of Air Force intelligence analysts became convinced that the USSR had perhaps upwards of a hundred ICBMs.[5] By contrast, the US Central Intelligence Agency (CIA) analysts argued that there were perhaps a dozen. As it turned out, there were only four; all of them liquid-fueled SS-6 missiles which required hours of preparation to launch. By contrast, at that time the US had 170 land-based Titan and Atlas ICBMs, and was quickly building more. It also had eight George Washington and Ethan Allen class ballistic missile submarines, each with the ability to launch 16 Polaris missiles with a range of 2,200 kilometres (1,400 mi).

“Small” Lie: Big Consequences

Figure 28: US President Dwight David Eisenhower.

Once this debate between the Air Force and CIA analysts was leaked to the press, the Democrats argued that Republican President Dwight D. Eisenhower was not spending enough money on national security and that the US was, as a consequence, open to nuclear annihilation. In his 1960 campaign JFK echoed these charges and used them to considerable effect against his opponent. Later, he alleged that he took this position because he did not have access to the intelligence data indicating that there was no missile gap, and that in fact the US had vast nuclear superiority in every sphere of nuclear weapons delivery (bomber, ICBM, IRBM, submarine). In fact, JFK had been extensively briefed by the Director of the CIA, Allen Dulles, in July of 1960 – as had his running mate, Senator Lyndon Johnson. Dulles summarized his briefing of the Democratic candidates in a letter to President Eisenhower in August of 1960. Eisenhower, who determined the scope of the briefing, was deeply disturbed by the emergence of what he termed the “military-industrial complex” and was suspicious that the “missile gap” was yet another manifestation of its endless thirst for taxpayer dollars for needless and costly high-technology military infrastructure.

JFK and his campaign manager brother, Robert Kennedy, were savvy enough politicians to turn a deaf ear to the truth about the missile gap. Thus, when JFK won the presidency in 1960 he entered office in the position of being unable (and perhaps unwilling) to curtail the explosion in the number of US nuclear offensive weapons. Indeed, it was not until USSR Premier Nikita Khrushchev began applying pressure, in the form of the Berlin Wall, to “test” JFK’s mettle as a statesman and leader that JFK conceded, first to Khrushchev, and later to the world, that not only was there no missile gap, but that the US held an enormous strategic lead. While not news to the Soviets, this public revelation put them in the difficult position of being made acutely and unavoidably aware that they were vulnerable to a “successful” first-strike nuclear attack by the US.[6]

The Caribbean Crisis

Figure 29: Soviet Premier Nikita Khrushchev.

Khrushchev and the Politburo had good reason to be concerned about the new American President’s intentions and aggressiveness following the failed US-sponsored invasion of the USSR’s ally, Cuba, in April 1961 (the “Bay of Pigs” invasion). Kennedy’s perceived “soft” response to the pressure applied from 4 June – 9 November 1961 during the second “Berlin Crisis,” which resulted in the erection of the Berlin Wall, encouraged Khrushchev to pursue a redress of the imbalance of nuclear deterrence which both sides now knew and acknowledged existed. The USSR lacked the economic base to maintain ground-based ICBM parity with the US, but what it could afford to do was to produce additional inexpensive intermediate range ballistic missiles (IRBMs) and, following the US’ lead in Turkey (with its Jupiter IRBMs), deploy them closer to the US – effectively negating its lack of intercontinental delivery capability. Cleverly, with this action, Khrushchev perceived that he could kill two birds with one stone: redress the missile gap – the real missile gap – and once and for all protect the USSR’s ally Cuba from invasion by the US.

The Cuban Missile Crisis, the result of the decision taken to place intermediate and short range tactical nuclear ballistic missiles in Cuba, is not known by that name in the Russian-speaking world. Rather, it is called the “Caribbean Crisis,” or alternatively, the “October Crisis.” Recently, I sat in Kiev, surrounded by a group of 20- and 30-something Russian and Ukrainian men (and one woman). I was to depart the following morning for the former Soviet ICBM complex in Pervomaysk, and a question had arisen as to why I was interested in going to such a grim place. At that point one of the group leaned forward and said, a bit expectantly, perhaps with irony and perhaps with a little bit of awe, “You know, Mike is the only person here who was alive during the Caribbean Crisis!” This was answered by a chorus of “Ahhhs” and “Umms.” My response was grimmer still: “I was not only alive at the time; I’m old enough to remember it well.”

It is not unreasonable to view the Crisis through the lens of the conflicting ideologies and economic systems of the two nation-state empires of the time. But to confine the discussion to that plane would be to miss the most important lesson it has to teach, in the same way that the Behaviorists of the first half of the 20th century failed to grasp the criticality of the inner workings of the mind in giving rise to animal and human behavior. At its most fundamental level the Crisis was about the failure of two adversaries, and ultimately two men, to understand the intent and the fundamental values (moral, ethical and pragmatic) each held. The politics and the world views of the parties involved were important, but not nearly as important as what each side felt, viscerally, about the other.

The Western view of Russia and of communism was that their economic system (collectivism) was a grotesque and nearly complete failure, a failure which was the direct result of collectivist ideology and practice. This point of view was easily confirmed by examining the USSR, either by econometric criteria, or directly, by visiting. Russia’s GDP was approximately 10% of the US’s in the early 1960s, and the level of consumer well-being and real wealth was a fraction of that in the US at the time. The US science fiction writer Robert Heinlein toured the USSR in 1960, and his travelogue of that visit was published in his book Expanded Universe in 1980. Heinlein argued that any notion that the USSR represented a threat to the US, or to the West, was ludicrous, based on his own and his wife’s (Virginia Heinlein’s) firsthand observations of the Soviet standard of living, degree of technological achievement, and population demographics.

Figure 30: In 1960 science fiction author Robert Heinlein and his third wife, Virginia Gerstenfeld Heinlein (both shown at left on the set of the movie Destination Moon, on which Heinlein served as a technical adviser and scriptwriter), toured the USSR, where they were, by chance, present during the U-2 crisis, when the Soviets shot down a US U-2 spy plane overflying the USSR to gather intelligence (primarily) on Soviet ICBM installations and captured its pilot, Francis Gary Powers (who had failed to use the cyanide ampoule provided to avoid just such a contingency). Virginia Heinlein became fluent in Russian prior to their visit for the express purpose of being better able to evaluate Soviet society. Both of the Heinleins were virulently anti-communist and anti-Soviet. The account of their visit to the USSR was not commercially published until 1980, when it appeared in a collection of Robert Heinlein’s works entitled Expanded Universe.

Heinlein wrote of the medieval state of the country outside of the major cities, and he noted, correctly, that Russia was imploding in terms of population as early as 1960. This latter conclusion was arrived at by the simple expedient of Virginia Heinlein asking every woman she met how old she was, her marital status, how many children she had, and so on; a simple and yet highly reliable way of gathering otherwise impossible-to-obtain demographic data. This was the kind of spying the CIA should have been doing at the time, and apparently was not. Heinlein went on to question the very possibility that the USSR could have any significant nuclear missile infrastructure in 1960-1. All of this he based on his observations of the general state of the Soviet economy, and of the particulars of that part of their high technology infrastructure he was able to examine as a VIP tourist in 1960. The CIA could have benefitted greatly from his analysis, if not from his rabidly anti-Soviet perspective.

Figure 31: Vladimir in 988 CE. (Holland Park W11, London, UK)

There is no question that this was indeed the state of the Soviet Union in 1960, and so at first glance it would seem that the analysis of the Soviet Union, and Russia in particular, as a backward and economically blighted state with a contracting population[7],[8] was justified. However, that was not the whole picture, and it fails to account for a number of very material anomalies for which the Russian-speaking peoples may properly be credited with being unique. I know of no other example in history where a people have consistently and deliberately reached outside of their culture to select their core social technologies.

Perhaps the first and best documented example of this is the selection of Christianity as the new religion for the Russian people by Vladimir I in 987 CE. The history of Kievan Rus’[9] authored by the Kievan monk Nestor (1056-1114 CE) during the reigns of Vsevolod I and Svyatopolk II notes that, in consultation with his boyars, Vladimir sent envoys to study the religions of neighboring nations. Nestor reports that Vladimir’s envoys found that the Muslim Bulgarians had “no gladness among them; only sorrow and a great stench,” and that Islam was undesirable due to its taboo against the consumption of alcohol and pork, to which Vladimir is said to have remarked, “Drinking is the joy of the Russes. We cannot exist without that pleasure.”

Nestor also describes Vladimir evaluating Judaism and eventually rejecting it, noting that the Jews’ “loss of Jerusalem demonstrated they were no longer in God’s favor.” In the end, Vladimir decided upon Christianity, probably in no small measure because his envoys, describing a majestic High Mass in the Cathedral Hagia Sophia, reported back from Constantinople that “We no longer knew whether we were in heaven or on earth … nor such beauty, and we know not how to tell of it,” and because of the horrid end promised to nonbelievers come the end of the world and the day of last judgment. It should also be noted that the substantial power and formidable wealth of the Byzantine Empire at that time was likely no small inducement as well. Forthwith, all of Rus was converted to Christianity by royal ukase.[10]

Figure 32: Vladimir Ilyich Lenin introduces a fundamentally new social and economic system, communism, to the Russian people in 1919.

The adaptation of the Ancient Greek uncial alphabet by Cyril and Methodius to create Russian Slavonic Cyrillic was a similarly “deliberate” decision, taken in order to facilitate the propagation of Christianity in the Rus via the written word.[11] This pattern of an active and often methodical search for, and adaptation or adoption of, intellectual assets external to the Russian culture has been repeated throughout Russian history. Perhaps the most notable and most recent example was the adoption of a radical, untried and very alien economic system in the form of Marxist(-Leninist) communism.[12] The economist and economic historian Jack Hirshleifer has called the first five years of Soviet Communism “the most extreme effort in modern times to do away with the system of private property and voluntary exchange.” It came at an enormous cost in lives and property which, under Stalin, grew to a grotesque and fantastic degree.

Figure 33: Illiteracy in Russia, by sex, between 1856 and 1915.

The idea that the inferiority of the USSR’s economic performance was solely an artifact of communism infuriated the Soviets, who believed that this judgment was fundamentally unfair and deliberately biased. To some extent, they were justified in this belief. At the time of the Russian Revolution in 1917, Russia was an agrarian monarchy that can only be described as being trapped somewhere between the Middle Ages and the 18th (not the 19th) century. Literacy rates prior to 1900 were in the range of 20%, and while they had risen to ~30% by 1920 (www.marxists.org/archive/lenin/works/1923/jan/02.htm), this should be contrasted with the literacy rate in the US, which had been ~90% since the Revolutionary War in 1776, and with the literacy rate in the United Kingdom at the start of the 20th century, which was ~80%.

The US, in addition to having a relatively stable governmental regime from 1800 to 1920, had also benefitted enormously from the abundance of “low hanging resource fruit” available to fuel its economic expansion.[13] At the cost of minimal blood and treasure, the US had expanded westward, acquiring enormous assets in the spheres of agriculture, minerals, oil, and lebensraum. An even greater and often overlooked benefit was the vast influx of immigrants, both skilled and unskilled, and all able-bodied,[14] fleeing various failed and failing economies in Europe, as well as adverse religious, political, and social conditions in their home countries. This influx of youthful, often skilled and always able-bodied human capital provided additional motive force to the engine of US economic expansion. The absence of any significant regulatory burden, the near absence of institutionalized corruption, and ready access to vast natural resources drove the US economy forward in an unprecedented way.[8]

Figure 34: Growth in the USSR’s gross domestic product (GDP) from 1970 to 1990. Sources: Ofer, Gur. (1987). “Soviet Economic Growth, 1928-1985.” Journal of Economic Literature 25(4):1767-1833; and Easterly, William, and Fischer, Stanley. (1995). “The Soviet Economic Decline: Historical and Republican Data.” World Bank Economic Review 9(3):341-371.


Figure 35: Growth in the US gross domestic product (GDP) from 1970 to 1990 (blue) compared with the averaged (curve smoothed) growth of the Soviet GDP over the same period of time.

Russia, while gifted with enormous natural resources, lacked the well-developed and industrially capable base of human capital that the US enjoyed during the same period, and it was also plagued by a large burden of corruption in the pre-Soviet era.[15] These facts alone would have served as a basis for Soviet antipathy towards the West’s evaluation of Soviet economic performance during the interval from 1918 to 1961. However, an additional and wholly justified source of Soviet resentment of the Western criticism of “deficient” Soviet economic growth and prosperity was the effect of WWII on the Soviet economy and the Russian-speaking peoples. It is hard for most Westerners who lived through WWII, let alone those alive now, to even begin to understand the devastating impact that the Great Patriotic War (the name by which the Russian-speaking peoples refer to WWII, and in particular their prosecution of the war on its eastern front) had on the Soviet economy and on the Soviet peoples.

The Eastern theatre of World War II was the most lethal and costly conflict in human history to date. In excess of 30 million people were killed in this conflict[16] with brutality exercised on the civilian and combatant populations by both sides (i.e., Nazi and Soviet) that was without parallel in scale, if not cruelty, in the history of warfare. As Time magazine noted in a 2008 retrospective on the war in the Eastern theatre: “By measure of manpower, duration, territorial reach and casualties, the Eastern Front was as much as four times the scale of the conflict on the Western Front that opened with the Normandy invasion.”[17]

Figure 36: The blue line shows the probable rate of population growth and the likely absolute numbers that would have been expected if Russia had not experienced the Great Patriotic War. Population growth was set back ~20 years as a consequence of the war. The relatively sharp increases and decreases in population that occurred from 1900 to 1939 were an artifact of the Russian Empire losing territories containing ~30 million people after the Russian Revolution (Poland, 18 million; Finland, 3 million; Romania, 3 million; the Baltic States, 5 million; and Kars, ceded to Turkey, 400 thousand). World War II losses were estimated at between 25-30 million, including an increase in infant mortality of 1.3 million. Total war losses include territories annexed by the USSR in 1939-45.

Both Nazi Germany and the Soviet Union utilized “scorched earth” tactics, with the Soviet Union suffering 20 million civilian deaths and total human losses of 26.6 million.[18] The Soviets destroyed as much as possible of the infrastructure and materiel (military and civilian) that they could not evacuate, in order to deprive the advancing forces of any benefit. Subsequently, as the Germans retreated from formerly occupied territories, particularly in Ukraine, Russia’s breadbasket, SS Commander Heinrich Himmler ordered SS-Obergruppenfuehrer Prutzmann “to leave behind in Ukraine not a single person, no cattle, not a ton of grain, not a railroad track … The enemy must find a country totally burned and destroyed” (September 7, 1943).[19] Acting in concert, the SS and the Wehrmacht destroyed 18,414 miles of rail lines, flooded mines, razed factories and poisoned countless wells.

In excess of 2 million homes and other structures were burned or razed. Nazi Ostland Administrator Erich Koch ordered that “the homes of recalcitrant natives … are to be burned down; relatives are to be arrested as hostages.” The Soviets estimated that the retreating Germans “razed and burned over 28,000 villages and 714 cities and towns, leaving 10,000,000 people without shelter. More than 16,000 industrial enterprises, more than 200,000 industrial production sites, 27,910 collective and 872 state farms, 1,300 machine and tractor stations, and 32,930 general schools, vocational secondary schools and higher educational institutions of Ukraine were destroyed.” Thus, the USSR was subjected to two rounds of comprehensive and devastating destruction of its resources and productive infrastructure.

The direct damage to the Ukrainian national economy caused by the fascist occupation came to 285,000,000,000 rubles.[20] Large numbers of Ukrainians were deported to the Reich for slave labor. The historian Geoffrey A. Hosking has noted that “The full demographic loss to the Soviet peoples was even greater than the raw numbers indicate, since a high proportion of those killed were young men of child-begetting age; the postwar Soviet population was 45 to 50 million smaller than post-1939 projections would have led one to expect.”[21] Considering the economic impact of WWII using total population loss from 1939 to 1945 as a surrogate marker: US losses were 0.32% of the population, UK losses were 0.94%, and the USSR’s losses were a staggering 13.88% of the population, with much of this demographic consisting of young and able-bodied individuals at or near their peak period of productivity and reproductive potential.[22]
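Those loss percentages are easy to verify. The short Python sketch below uses approximate death tolls and 1939 populations (my round figures, not numbers supplied in the text) to reproduce the cited loss rates:

```python
# Back-of-envelope check of WWII population loss as a share of prewar population.
# Death tolls and 1939 populations are my approximations, not the author's data.
losses = {
    "US":   (420_000,    131_000_000),
    "UK":   (451_000,     47_800_000),
    "USSR": (26_600_000, 191_700_000),
}
for country, (dead, population) in losses.items():
    print(f"{country}: {dead / population:.2%} of population lost")
```

With these inputs the ratios come out to roughly 0.32%, 0.94% and 13.88%, matching the figures cited above to within rounding.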

Figure 37: The mind-numbing human losses suffered by the USSR during the “Great Patriotic War” are put into perspective when compared with the total human losses of the other nation-states involved in the conflict.

What is widely remembered and understood is that the USSR achieved a military victory over Nazi Germany. Too often not considered is that the property damage inflicted by the Axis invasion was estimated to be on the order of 679 billion rubles. The war resulted in the complete or partial destruction of 1,710 cities and towns, 70,000 villages, 2,508 church buildings, 31,850 industrial facilities, 40,000 miles of railroad, 4,100 railroad stations, 40,000 hospitals, 84,000 schools, and 43,000 public libraries. The majority of the USSR’s livestock were slaughtered for food, destroyed to prevent exploitation by the enemy, or lost to starvation, disease or escape from captivity.[23]

It is interesting to note that despite its communist/socialist economy, the victory of the USSR over the Axis was in large measure a result of its war industry being able to consistently outperform the Germans, despite the enormous loss of population and land. Stalin’s much ridiculed five-year economic plans, carried out during the 1930s, had resulted in the industrialization of the Urals and central Asia – albeit at great cost in human life and suffering. In 1941, the same trains that transported Soviet troops to the Eastern front were used to evacuate thousands of hastily disassembled factories from Belarus and Ukraine, to areas far removed from the front line. When this industrial capacity was reconstituted east of the Urals, war production could be continued safely out of reach of Luftwaffe bombers. The large increases in the production of war materiel that were necessary to sustain the Soviet war effort were achieved by a large reduction in the civilian standard of living via the application of “total war,” in conjunction with assistance from the US and the UK in the form of the Lend-Lease program, which shipped vast amounts of war materiel via the famed Murmansk convoys.[24]

As German manpower losses mounted and became critical during the last half of the war, the Germans were able to compensate for manpower attrition through the use of slave labor from conquered Eastern European countries, interned Jews and Soviet POWs. Despite the Germans’ superior production of raw materials, they were unable to approach, let alone match, Soviet production of war materiel. Much of this disparity was due to a fundamental difference in manufacturing strategy. In 1943 the Germans made an explicit strategic decision to favor quality over quantity in the production of war materiel, whereas the Soviets settled on a strategy of refining and simplifying the designs of existing military hardware while steadily increasing the volume of production.

Figure 38: Poster promoting the Marshall Plan (ERP), circa 1950. Note that the US flag forms the wind vane, the “tail” that keeps the windmill pointed into the wind and thus on track to generate motive force.

By contrast, the US suffered trivial losses in manpower, capital and systemic infrastructure (i.e., roads, power plants, factories, etc.), and in fact emerged from the war in a better position, in terms of global infrastructure and industrialization, than when it entered it. The end of WWII also saw the US become a first-class imperial power, replacing Great Britain as the world’s foremost superpower. The US exercised economic and political influence over Western Europe, and control over occupied Japan and West Germany, via the Marshall Plan (European Recovery Program, ERP), a $13 billion US aid program to facilitate the rebuilding of European infrastructure and economies in the aftermath of WWII, primarily to contain the spread of communism and to provide a viable export market for US manufactured goods.[25] The ERP operated from 1948 to 1952 and succeeded in vastly increasing Western European productivity, standard of living and wealth, in no small measure as a result of what has been called the “bonfire of the regulations,” wherein international trade barriers within Europe were largely eliminated and choking internal regulation, excessive taxation and corruption/favoritism were largely swept away. By 1952, as ERP funding ended, the economy of every participating nation-state had surpassed pre-war levels; for all ERP recipients, output in 1951 was at least 35% higher than in 1938.[26], [27]

The ERP was also offered to the USSR and its client states, but they refused it. In late September of 1947, the Soviet Union called a meeting of the nine European Communist parties in southwest Poland.[28] The position of the USSR regarding the ERP was that “international politics is dominated by the ruling clique of the American imperialists,” who were intent upon the “enslavement of the weakened capitalist countries of Europe.” The communist parties in Europe were instructed to initiate a guerrilla struggle, including the use of “sabotage,” against US imperialism in Europe.[29] The report further claimed that “reactionary imperialist elements throughout the world, particularly in the U.S.A., in Britain and France, had put particular hope on Germany and Japan, primarily on Hitlerite Germany — first as a force most capable of striking a blow at the Soviet Union.”[30]

This paranoid response to the ERP was likely the result of a complex interaction of factors: Stalin’s psychopathic and paranoid personality; the intense anger and resentment of the USSR over the punishing economic and human losses suffered on the Eastern Front in the Great Patriotic War; and finally, a deep and not wholly unjustified mistrust of the US’ intention to profit from and expand its global sphere of influence, as indicated by language in the report to the effect that “the bosses of Wall Street” were “taking the place of Germany, Japan and Italy.”[31] The ERP was also described by the Kremlin as “the American plan for the enslavement of Europe,”[32] and it described the global geopolitical situation as being divided “into basically two camps—the imperialist and antidemocratic camp on the one hand, and the anti-imperialist and democratic camp on the other.”[33]

Viewed in this light, the deliberate transformation of Russia into a world-class industrial economy (second only to the US) by 1952, in the aftermath of the 1917 Revolution and in the face of the unprecedented devastation and disruption suffered by the Soviet state during WWII, looks considerably more impressive. It is also worth noting that during the early period of nuclear proliferation (1960 to 1970) the Russian economy grew at an impressive rate of ~5% per year – in the absence of capitalism!

Figure 39: US and USSR/Russian Strategic Offensive Nuclear Forces, 1945-1997. Note that the US was solidly ahead in ICBM capability until circa 1968 and that the US maintained strategic superiority or parity with the USSR throughout the course of the Cold War. Source: Robert S. Norris and Thomas B. Cochran, U.S.-USSR/Russian Strategic Offensive Nuclear Forces, 1945-1996, Nuclear Weapons Databook Working Paper 97-1 (Washington, D.C.: Natural Resources Defense Council, January 1997); Robert S. Norris and William M. Arkin, “NRDC Nuclear Notebook (U.S. Strategic Nuclear Forces, End of 1997),” Bulletin of the Atomic Scientists, January/February 1998, pp. 70-72; Robert S. Norris and William M. Arkin, “NRDC Nuclear Notebook (Russian Strategic Nuclear Forces, End of 1997),” Bulletin of the Atomic Scientists, March/April 1998, pp. 70-71.

The cost of the Cold War was staggering: for the US, the bill was $19.65 trillion (1948-1991) in 2010 dollars, of which $8.73 trillion (also in 2010 dollars) was expended directly on nuclear arms.[34], [35] Precise data for the dollar cost of the Cold War to the USSR are not available; however, it is generally believed that the Soviet Union spent 12-13% of its GDP on military programs in direct support of the Cold War. The cost in hardship to Soviet citizens was vastly greater than this fraction of GDP might suggest, due to the smaller size and lower level of technological sophistication of the Soviet economy. As the CIA noted in its 1977 A Dollar Cost Comparison of Soviet and US Defense Activities, 1966-1976 (Secret):

“…the dollar cost comparison shows Soviet defense activities to exceed those of the United States by about 40 percent in 1976. If both are measured in terms of estimated ruble costs, the Soviet activities are about 25 percent larger than the US. Thus, the effect of the index number problem is not large enough to alter the basic conclusion that Soviet defense activities overall are currently larger than those of the United States.”[36]

The election of JFK – in significant measure on the basis of a bald-faced lie about a mythical missile and nuclear arms gap between the US and the USSR – was, in effect, a green light given by the American electorate for the indefinite entrenchment of Eisenhower’s darkest nightmare for the post-WWII world: the “military-industrial complex,” and with it the increased likelihood of global thermonuclear annihilation. JFK’s new military strategy, known as “Flexible Response,” constituted a commitment for the US and its allies to remain on a wartime footing and a wartime economy until the Cold War ended in 1991. Beyond the horrendous cost in dollars, the myth of the missile gap further corroded US-Soviet relations and created unnecessary doubt and anxiety among both US citizens and America’s allies. Those in the know in positions of leadership in other nations and in multinational corporations must have had frequent occasion to ask themselves, “If the US, with its massive economy and enormous technological base, cannot maintain nuclear parity with the USSR – in reality an emerging Third World nation – what kind of leadership and security can we expect from the US?”

If Soviet expenditures for the Cold War were indeed on a par with those of the US, then the approximate dollar value of the whole endeavor, for both sides, would be in the range of $40 trillion in 2010 US dollars. To put that into perspective: that is, give or take, comparable to the total net worth of the United States of America at current market value, i.e., $50-60 trillion. While not the only factor in the collapse of the USSR, the Cold War was a very material factor in its financial implosion. The US and the rest of the West are said to have “won” the Cold War, with the tacit assumption being that the US, unlike the USSR, did not spend itself into bankruptcy making weapons of mass destruction and creating and supporting the enormous infrastructure required for their care and feeding. For myself, I doubt very much that the West will escape without paying the same price the USSR did – possibly considerably more, and with interest.
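The $40 trillion figure is simple arithmetic on the numbers above; a minimal sketch, with Python used purely as a calculator (the doubling step rests on the essay's own assumption that Soviet spending roughly matched US spending):

```python
# Back-of-the-envelope Cold War cost arithmetic, in 2010 dollars.
us_total = 19.65e12       # total US Cold War outlay, 1948-1991 (from the text)
us_nuclear = 8.7315e12    # US spending directly on nuclear arms (from the text)

# Assumption carried over from the text: Soviet expenditures were roughly
# on a par with US ones, so the combined bill is about double the US figure.
combined = 2 * us_total

print(f"Combined US + USSR cost: ~${combined / 1e12:.1f} trillion")   # ~$39.3 trillion
print(f"Nuclear share of US spending: {us_nuclear / us_total:.0%}")   # 44%
```

The ~$39.3 trillion result is where the essay's round "$40 trillion" comes from.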

As peoples, we squandered the funds that would have paid for us to become a space-faring species, to develop self-sustaining industrial technology, and above all, to make vast strides in the life extension sciences – in particular, in developing suspended animation, and thus medical time travel. Failure to achieve the latter has condemned billions of human beings to death, making the Cold War by far the most expensive war, in dollars and in lives, in the entire history of our species. We cannot continue in this fashion if we are to survive, either as individuals or as a species. Having said that, there are many lessons to be learned from the Cold War and from the effects of its aftermath, which we are now suffering. Some of those lessons will be discussed directly.

End of Part 3

References & Footnotes


[1] The book was made a best seller in large measure by Kennedy’s powerful and influential father, Joseph P. Kennedy.

[2] Kennedy, John F. Why England Slept, Funk, New York, 1940. Reprinted by Greenwood Press, ISBN 0313228744 (1981).

[3] Parmet, Herbert S. Jack: the struggles of John F. Kennedy.  Dial Press, New York (1980).

[4] Swift, Will. The Kennedys Amidst the Gathering Storm: A Thousand Days in London, 1938-1940. Collins/Smithsonian, 352p. ISBN 978-0-06-117356-1 (2008).

[5] Goodman, Melvin A. “Exaggeration Of The Threat: Then And Now“, The Public Record, 14 September 2009. Retrieved 04-19-2011.

[6] Preble, Christopher, John F. Kennedy and the Missile Gap, Northern Illinois University Press, ISBN-10: 0875803326 (2004).

[7] “Demography and development in Russia,” UN Development Program, 28 April 2008. Retrieved 04-11-2011.

[8] Global decline in the Russian population did not occur until much later; however, the beginnings of this process were correctly noted by Virginia Heinlein, whose questioning was necessarily confined to the large Russian cities the Heinleins were permitted to visit. Population collapse typically starts in cities, where the reproduction rate is lower than in rural and agricultural demographics.

[9] Chadwick, N. G. The Beginnings of Russian History: an Enquiry into Sources, Cambridge University Press, ISBN 0-404-14651-1 (1946).

[10] Moss, Walter G. “A History of Russia Volume I: To 1917,” Anthem Press, London (2002).

[11] The World’s Writing Systems. Oxford University Press. ISBN 0-19-507993-0 (1996).

[12] Richman, Sheldon, “War Communism to NEP: The Road to Serfdom” (PDF). The Journal of Libertarian Studies 5 (1): 89–97, (1981): http://mises.org/journals/jls/5_1/5_1_5.pdf. Retrieved 04-19-2011.

[13] Cowen, Tyler, The Great Stagnation: How America Ate All The Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better: A Penguin eSpecial from Dutton [Kindle Edition], ASIN: B004H0M8QS (2011).

[14] All immigrants to the US were screened for disease and able-bodied status before being allowed to enter the country; Ellis Island, NY was the principal intake and quarantine facility used for this purpose.

[15] The Great Patriotic War of the Soviet Union, 1941-45: A Documentary Reader, Routledge, pp. 5712-7, ISBN 978-0-7146- (2008).

[16] Krivosheev, G.I. Soviet Casualties and Combat Losses, Greenhill, pp. 280-7, ISBN 1-85367- (1997).

[17] Bonfante, Jordan (23 May 2008). “Remembering a Red Flag Day” Time: http://www.time.com/time/world/article/0,8599,1809018,00.html.

[18] Hosking, Geoffrey A. “Rulers and victims: the Russians in the Soviet Union,” Harvard University Press, Harvard, MA. p. 242. ISBN 0-674-02178-9 (2006).

[19] Dallin, Alexander. German Rule in Russia 1941-1945, Macmillan, London (1957).

[20] Bazhan, M.P. ed. Soviet Ukraine ( Kiev: Editorial Office of the Ukrainian Soviet Encyclopedia, 1969), 569p. Published by the Academy of Sciences of the Ukrainian SSR.

[21] Hosking, Geoffrey A. Rulers and victims: the Russians in the Soviet Union. Harvard University Press, Harvard MA, p.242. ISBN 0-674-02178-9 (2006).

[22] http://en.wikipedia.org/wiki/World_War_II_casualties

[23] The New York Times, 9 February 1946, Volume 95, Number 32158.

[24] The Lend-Lease program operated in the form of a loan. Repayment of that loan to the US proved a significant drain on the economies of its Allies. The UK completed its debt repayment to the US in 2004, and the USSR in 2006: http://www.telegraph.co.uk/finance/2945924/Reborn-Russia-clears-Soviet-debt.html. Retrieved 05-11-2011.

[25] Milward, Alan S. The Reconstruction of Western Europe 1945-51, University of California Press, Berkeley (2006).

[26] Eichengreen, Barry, The European Economy since 1945: Coordinated Capitalism and Beyond, p. 57 (2008).

[27] Mills, Nicolaus, Winning the Peace: The Marshall Plan and America’s Coming of Age as a Superpower. Wiley, New York, ISBN 978-0-470-09755-7, p. 195 (2008).

[28] Behrman, Greg. The Most Noble Adventure: The Marshall Plan and the Time When America Helped Save Europe. Free Press, New York (2007).

[29] Wettig, Gerhard, Stalin and the Cold War in Europe, Rowman & Littlefield, New York, ISBN 0742555429, p. 146 (2008).

[30] Wettig, Gerhard, Stalin and the Cold War in Europe, Rowman & Littlefield, New York, ISBN 0742555429, p. 142 (2008).

[31] Behrman, Greg. The Most Noble Adventure: The Marshall Plan and the Time When America Helped Save Europe. Free Press, New York (2007).

[32] Wettig, Gerhard, Stalin and the Cold War in Europe, Rowman & Littlefield, ISBN 0742555429, p. 146 (2008).

[33] Wettig, Gerhard, Stalin and the Cold War in Europe, Rowman & Littlefield, ISBN 0742555429, p. 145 (2008).

[34] Rhodes, Richard, Arsenals of Folly: The Making of the Nuclear Arms Race, Knopf, New York (2007).

[35] Schwartz, Stephen I., ed., Atomic Audit: The Costs and Consequences of U.S. Nuclear Weapons Since 1940, Brookings Institution Press, Washington, DC (1998). Further information about Atomic Audit can be found at <http://www.brookings.edu/projects/archive/nucweapons/weapons.aspx>. Retrieved 06-11-2011.

[36] CIA, A Dollar Cost Comparison of Soviet and US Defense Activities, 1966-76 (Secret) SR-77-10140, October 1977, p. 2.

 

Achieving Truly Universal Health Care
http://chronopause.com/index.php/2011/02/14/achieving-truly-universal-health-care/
Tue, 15 Feb 2011

By Mike Darwin

Contemporary Medicine: Playing Peek-a-Boo with Death

The Proper End of Medicine

In my experience, physicians get evaluations that parallel those most often given to prostitutes: they don’t pay enough attention to you, there is typically a lack of the desired amount of enthusiasm and intimacy, the critical emotional moments are faked, the encounter never lasts long enough, you may discover as a consequence of your visit that you have a loathsome disease, the hourly rate is punishing, and the most you can hope for is palliation, not real relief.

The first thing that anyone needs to understand about medicine is what its proper goal is. That’s actually pretty simple: to cure disease and maintain good health. No further qualifications are necessary. Once that proposition is accepted, it then should become obvious that the end goal, and the ultimate ideal of medicine, is to keep people alive and in good health indefinitely. Even the television physician pundits, like Dr. Oz, and CNN’s Sanjay Gupta, can sense that this is coming, and in fact, Sanjay pretty much said so:

“Practical immortality may now be within our grasp thanks to the cutting-edge scientific research and amazing medical breakthroughs that are coming at such astonishing speed we can hardly keep up.”1

- Sanjay Gupta, M.D.

Figure 1: Immortality is a lot like sex, in that it is something few will admit to wanting a lot of, and that almost everyone thinks their neighbor has too much of.

So, despite the fact that most people, when asked, will recoil in horror from the notion of personal, biological immortality, the fact is that that is exactly what they expect, exactly what they want, and exactly what they will effectively demand. Unfortunately, most of the medicine we practice today is not only not going to provide immortality any time soon, it is going to bankrupt us while turning us into human fleshpots, sitting in the solariums of nursing homes and extended care facilities the world over.

The late great physician-philosopher-writer Lewis Thomas first identified the problem in 1974, in his classic book, The Lives of a Cell. Thomas wrote insightfully about the different kinds of medicine we humans are capable of practicing, classifying them as Prevention, No Technology, Low Technology, Halfway Technology and High Technology. I have created a color wheel of Thomas’ medical technologies, and added one of my own: Futile Technology – the kind of technology which increasingly characterizes the medicine we practice today (Figure 2).

Figure 2: The Spectrum of current medical technologies practiced today

Prevention and low technology medicine are fairly straightforward concepts and do not need our attention here. But High Technology Medicine (HTM), Futile Technology (FT), and especially Halfway Technology (HT), deserve considerably greater scrutiny.

Figure 3: The distribution of health care expenditures over a lifetime: most of the money is spent in the last decade of life, and most of that on futile and ineffective medicine that turns dying into a long, costly, morbid process. Nation-states would do better to advise their citizens to smoke and drink with abandon if they are truly interested in reducing the amount of suffering and avoiding bankrupting their entire economies. Alternatively…

Thomas elegantly describes Halfway Technology as follows:

“Halfway technology represents the kinds of things that must be done after the fact, in efforts to compensate for the incapacitating effects of certain diseases whose course one is unable to do very much about. By its nature, it is at the same time highly sophisticated and profoundly primitive… It is characteristic of this kind of technology that it costs an enormous amount of money and requires a continuing expansion of hospital facilities… It is when physicians are bogged down by their incomplete technologies, by the innumerable things they are obliged to do in medicine, when they lack a clear understanding of disease mechanisms, that the deficiencies of the health-care system are most conspicuous… The only thing that can move medicine away from this level of technology is new information, and the only imaginable source of this information is research. The real high technology of medicine comes as the result of a genuine understanding of disease mechanisms and when it becomes available, it is relatively inexpensive, relatively simple, and relatively easy to deliver.” —Lewis Thomas2

Figure 4: Polio victims on Iron Lung support in a school gymnasium in the mid-1950s.

To understand the difference between HT and High Technology Medicine (HTM), Thomas used the paradigm of the Polio epidemics of the mid-20th century as an example.3 Today, very few people understand either what the Polio epidemics of the 1950s were like, or the divergent ways in which researchers and clinicians sought to address the scourge. On the one hand, hundreds of thousands of people were contracting polio, with many suffering irreversible bulbar paralysis, which meant that they were unable to breathe. They were conscious and very much alive, but they were unable to use their respiratory muscles to ventilate themselves.

Figure 5: Jonas Salk, discoverer of the first clinically deployed Polio vaccine.

For many of such paralyzed patients, a relatively new medical device in the form of the Iron Lung represented an opportunity to go on living. In some patients the paralysis retreated, or vigorous physical therapy allowed them to recover sufficiently that they could once again breathe on their own.4 But for many, the Iron Lung was a life sentence of paralyzed immobility inside a cylindrical ‘steel coffin’ as seen in Figure 4.

A minority of scientists at that time believed that it might be possible to defeat Polio by the expedient of a vaccine,5 and so an intense competition for funds began between those who sought to secure more Iron Lungs to support the ever-growing legion of patients with respiratory paralysis, and those who sought to understand the fundamental basis of the disease (in the context of their technological era) and treat it by eliminating it.6 In other words, these researchers wanted to get to the root cause of the illness and stop it there, rather than develop ever more sophisticated Iron Lungs, and other prostheses, to pinch-hit for the muscles rendered useless and atrophied by Polio.

In one of the most rapid translations of bench research to bedside application, Jonas Salk and his colleagues developed a workable Polio vaccine,7 which was rolled out for public use in 1955 – the year I was born – just in time to ensure that yours truly would not end up in an Iron Lung, or be ‘lucky’ enough to escape a brush with Polio confined to a wheelchair, or using walking braces with a case of ‘simple paralysis,’ as did US President Franklin Delano Roosevelt. HT is Iron Lungs; the Salk, and later Sabin, vaccines were HTM. Insulin treatment for diabetes, artificial hearts/left ventricular assist devices, total hip and knee replacements, and drugs for hypertension are also all halfway medicine. They treat the clinical manifestations of disease with varying degrees of efficiency and cost effectiveness, but they do not ever effect a cure.

Figure 6: The chart above shows the approximate current distribution of health care dollars by the type of medical technology.

Most people aren’t satisfied with medicine because, fundamentally, most of medicine is still unable to address the underlying causes of disease. Medicine used to be almost completely worthless from a scientific standpoint, and physicians were mostly about diagnosis, prognosis, and hand-holding. Since 1900, medicine has become able to treat a few illnesses definitively, but it is still mostly about indirect, halfway treatments that are costly and in no way definitive. The older the patient, the more this is the case, because the real cause of most disease in the developed world is aging. Cancer is aging, hypertension is aging, diabetes is aging, most of urology (including sexual dysfunction) is aging, stroke is aging, Alzheimer’s is aging, and three-fourths of every health care dollar is currently spent on the deterioration and chronic illnesses that are a product of aging.

Patients don’t have the experience of their physicians examining them and then saying, “Well, you see, the problem is right here in the alpha-N-1-letterbox beta gene. I’ve re-coded that gene, as well as added on the 5.01 Rejuvenation and Cell Repair Package. Within 10 days to two weeks, you should have the libido of a 14-year-old, the appearance of a 20-year-old, the stamina of an endurance runner, and the life span of a Sequoia redwood.” Doctors can only very indirectly manipulate the machinery that makes us run, and they usually have to use costly and chronically applied chemicals that are typically as poisonous as they are therapeutic. Nobody is truly going to be thrilled about doctors until they can really cure illnesses.

Figure 7: This is as good as it gets. If contemporary medicine reached its goal of ‘squaring the curve’ and extending the average life span to at or near the maximum lifespan (~120 years), this is how you can expect to end up. The photo at left is of ‘supercentenarian’ Marie Bremont, taken on her 115th birthday in 2001. Absent definitive regenerative medicine, all that contemporary medical technology can do is maintain the function of existing tissues until physiological reserves become so depleted that the slightest environmental challenge causes death.

Because medicine is mostly a halfway business, it will have a terrible bite-back effect that will ultimately render it unsustainable, and/or leave the patient population truly pissed-off. The better we get at halfway medicine, the less cost effective it is. Dialysis, total artificial hearts, Gleevec, Viagra, anti-hypertensives, artificial joints – all these things ultimately create more and more people whose survival, let alone their restoration to health or full function, will eventually cost more than they, or society as a whole, could possibly pay for, even with a lifetime of hard work. In fact, right now medical care is consuming 16% of the US GDP, and that will rise to over 20% in just 4 years! No economy that we have any experience with can tolerate that kind of cash drain – it is simply unsustainable. And what’s worse, a quick glance at Figure 6 shows that we are currently spending ~25% of our health care dollars on FT – medicine which does no good, and which usually causes harm by inflicting further suffering and damaging the ability of the system to deliver care to patients who can genuinely benefit.
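The jump from 16% to over 20% of GDP in four years implies the health-spending share is compounding at nearly 6% per year; a quick illustrative check on the figures quoted above:

```python
# Implied compound annual growth of health care's share of US GDP,
# using the figures quoted in the text: 16% now, 20% four years out.
start_share, end_share, years = 0.16, 0.20, 4
annual_growth = (end_share / start_share) ** (1 / years) - 1
print(f"Implied growth of the spending share: ~{annual_growth:.1%} per year")  # ~5.7% per year
```

Since this is growth of the *share* of GDP, it sits on top of whatever the economy itself grows, which is why no economy can tolerate it for long.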

Figure 8: US healthcare costs projected to 2015 as a percentage of the GDP. The Newsweek article actually makes the case for the inevitability, and the fiscal wisdom of, “killing Granny.”8,9

So, we have only two pragmatic alternative systems:

1) Stop treating a large fraction of the population for chronic or costly illnesses and focus all our resources on diseases which can be definitively and cost-effectively managed. This sounds good, but leaves us forever trapped in a cruel world of stunted, or absent medical progress, and beset with a population suffering and dying, without hope.

Figure 9: Some examples of definitive, curative, high technology medicine. Stem cell and gene therapy are now in the early stages of development, and will constitute the first wave of ‘regenerative medicine.’ If the current rate of technological progress is sustained, it seems reasonable to presume that the first autonomous cell repair technologies will begin to see laboratory application by the closing decades of this century, with mature applications coming sometime in the opening decades of the 22nd century.10

2) Change how we spend our money, and focus on developing definitive medicine. Definitive medicine means controlling aging and addressing and treating the cause of disease at the molecular level, where it originates.

Until we have definitive medicine (which will happen incrementally, not suddenly) we need a cost effective way to manage patients with maladies that can only be halfway treated, or not treated at all. That means we need to develop truly reversible solid state suspended animation (SA). Within the past few years, a technology has been developed that allows for deep subzero cooling of organs and tissues without any ice formation occurring. This is possible because high concentrations of cryoprotective agents – antifreeze compounds exactly like the ethylene glycol and propylene glycol used in automotive radiator antifreeze – reduce the probability of ice formation and propagation; at high enough concentrations, these compounds can prevent ice formation completely, even at the slow cooling and warming rates that are necessary to transfer heat out of, and back into, large masses of tissue, such as human organs, or human beings.11-14

Because ice formation is inhibited, further cooling of the system results in increasing viscosity until, finally, the solution becomes so viscous that it has solidified. The point at which the system makes the transition from an ultra-viscous liquid to a molecularly arrested glass (the glass transition point, Tg) is the point at which essentially all chemical and biological activity is halted. This arrest of chemistry occurs independent of the temperature, and is a consequence of the immobilization of the chemical reactants in the glassy substrate of the cryoprotectant-water mixture. The conversion of an aqueous solution into a glass is known as vitrification, from the Latin word vitrum, meaning glass.
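The runaway rise of viscosity as the system approaches Tg is commonly described by the empirical Vogel–Fulcher–Tammann (VFT) relation; the equation is offered here only as illustrative background, not as something the essay itself invokes:

```latex
% Empirical Vogel--Fulcher--Tammann (VFT) description of a glass-former's
% viscosity; \eta_0, B and T_0 are material-dependent fitting constants,
% with T_0 below the glass transition temperature T_g.
\eta(T) \;=\; \eta_0 \exp\!\left( \frac{B}{T - T_0} \right)
```

As T falls toward T0, η diverges; Tg is conventionally taken as the temperature at which η reaches roughly 10^12 Pa·s, which is the operational threshold for the “molecularly arrested glass” described above.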

As a result of recent advances in vitrification technology, it is now possible to vitrify entire organs.15,16 However, a significant remaining problem is to inhibit ice growth during re-warming in some tissues that do not equilibrate well with the cryoprotectant chemicals. This ice formation occurs as a result of the generation of ultramicroscopic ice nuclei during cooling, which cannot grow or propagate, because there is too little energy in the system, and too little time for ice to grow as a result of steady and continued cooling to Tg.17 While this is a significant hurdle to be overcome, it is a technological, rather than a theoretical one. Additionally, virtually all research on reversible vitrification of organs has been conducted on the kidney, and this presents a unique challenge, because the interior of the organ, the renal medulla, is very poorly circulated. It is thus difficult to load a sufficient concentration of cryoprotectants into this poorly vascularized tissue to completely avoid ice formation.

Figure 10: Visual appearance of ice in a rabbit kidney that was cross-sectioned during rewarming. The kidney was perfused with a cryoprotective mixture called M22 at -22°C, cut in half, immersed in M22, vitrified at -135°C, and eventually re-warmed at ~1°C/min while being periodically photographed. Times (1:30 and 1:40) represent hours and minutes from the start of slow warming. The temperatures refer to ambient atmospheric temperatures near the kidney, not within the kidney itself. The upper panel shows the kidney at the point of maximum ice cross-sectional area, and the lower panel shows the kidney after complete ice melting. Both panels show the site of an inner medullary biopsy taken for differential scanning calorimetry in order to determine the actual concentration of cryoprotectants in the tissue with high precision. [http://cryoeuro.eu:8080/download/attachments/425990/FahyPhysicBiolAspectsRenalVitri2010.pdf?version=1&modificationDate=1285892563927]

A fair summary of the current technological state of the art is that it is likely now possible to place complex mammalian organs, such as the rabbit kidney, into indefinitely long suspended animation, with little or no loss of viability, and no damage as a consequence of structural disruption due to ice formation. The use of radio frequency, or microwave illumination to speed rewarming, the use of warm gas (such as helium) to perfuse the organ’s circulation, or a combination of both, may offer a workable solution to the problem of ice formation during rewarming. The point is, we are now palpably close to a fully reversible technology for inducing suspended animation in complex living systems. Perhaps most impressively, one mammalian kidney has survived vitrification and rewarming sufficiently intact to permit immediate support of the rabbit from which it was removed (as the sole kidney), until the animal was sacrificed for evaluation 29 days after the organ was re-implanted.

SA brings the cost of caring for any patient down to ~$1,000 per year, and accomplishes something that most people find very satisfying in life; namely, giving their intractable problems to someone else to solve. Since we are already sending the bills for our health care to the future, it only seems reasonable that we should send ourselves along, too.

Figure 11: The first kidney to survive vitrification, shortly before it was removed from the animal for evaluation after supporting its life as the sole kidney for 29 days.
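A maintenance cost on the order of $1K/year implies that indefinite care could be funded by a modest one-time trust; a sketch using the standard perpetuity formula (the 2% real return is an assumed illustrative rate, not a figure from the text):

```python
# One-time endowment needed to fund indefinite maintenance from investment
# income, using the standard perpetuity formula:
#     principal * real_return = annual_cost
annual_cost = 1_000     # ~$1K/year maintenance cost cited in the text
real_return = 0.02      # ASSUMPTION: 2% real (after-inflation) rate of return

endowment = annual_cost / real_return
print(f"One-time endowment at {real_return:.0%} real return: ${endowment:,.0f}")  # $50,000
```

At a higher assumed real return the required endowment shrinks proportionally, which is the economic logic behind funding long-term cryogenic care out of a fixed up-front sum.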

The problem of aging will be a difficult one to solve, and answers will come in iterations that unfold over the remaining decades of this century (provided we make this problem a priority – and survive as a technological civilization). While no well-informed scientist would argue that controlling (and even reversing) senescence is impossible, or even unlikely before the turn of the century, no responsible scientist would argue that we understand aging in the way the Wright Brothers understood flight at the dawn of the last century. There is a huge difference between a technological problem and a theoretical problem. Flight in 1907 was a technological problem, as was interplanetary rocketry in 1937: the theory was there, but the technology wasn't.

Figure 12: The first sustained heavier-than-air human flight, on 17 December, 1903.

Today, with respect to suspended animation, we are almost exactly where Orville and Wilbur Wright found themselves after they first achieved sustained heavier-than-air human flight, on 17 December, 1903, and where Goddard and von Braun were in 1937 in terms of achieving space travel. We now have a full understanding of the theoretical requirements for suspended animation, we have a solid proof of principle (a mammalian organ has recently been reversibly cryopreserved), and what remains to be done is to develop and expand the technology. Once medicine has SA, it has a cost-effective means to offer the promise of definitive therapy to almost every dying patient, at a fraction of the cost currently expended to deliver futile, halfway, or custodial care.

Figure 13: A rough estimate of the cost for induction of suspended animation in humans, followed by indefinite maintenance at ~-135°C until such time as definitive medical treatment, including mature regenerative medicine to treat aging, becomes possible. (Estimate prepared by the author using current costs of cryonic cryopreservation at the Alcor Life Extension Foundation in Phoenix, AZ, with an assumed reduction in costs as a result of economies of scale in long-term cryogenic care.)
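The order of magnitude behind an estimate like Figure 13's can be reproduced with a few lines of arithmetic. Every number below (liquid nitrogen price, boil-off rate, patients per dewar, overhead) is my own illustrative assumption, not Alcor's actual cost accounting; the point is only that routine cryogenic maintenance plausibly lands near the ~$1K/year figure cited above:

```python
# Back-of-envelope estimate of annual cryogenic maintenance cost per patient.
# Every figure below is an illustrative assumption, not Alcor's accounting.

ln2_price_per_liter = 0.50    # bulk liquid nitrogen, USD/liter (assumed)
boil_off_liters_per_day = 15  # boil-off for a multi-patient dewar (assumed)
patients_per_dewar = 4        # whole-body patients sharing one dewar (assumed)
overhead_per_patient = 500    # staff, facility, insurance, USD/year (assumed)

# LN2 cost is a property of the dewar, split across its occupants.
ln2_cost_per_patient = (boil_off_liters_per_day * 365 *
                        ln2_price_per_liter / patients_per_dewar)
annual_cost = ln2_cost_per_patient + overhead_per_patient

print(f"LN2 per patient:   ${ln2_cost_per_patient:,.0f}/year")
print(f"total per patient: ${annual_cost:,.0f}/year")
```

Economies of scale enter mainly through `patients_per_dewar`: boil-off belongs to the vessel, not the patient, so larger shared dewars drive the per-patient cost down.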

Physicians, politicians, bioethicists – all are confronted with a looming practical and moral catastrophe, and a difficult choice to make. It would appear that the reasonable and humane choice is to fund and perfect the only research that offers the prospect of indefinitely stabilizing care to virtually all dying patients: reversible human suspended animation. That is the most pragmatic, humane and cost-effective solution to the inadequacy, and the problem, of universal health care. Quality, satisfying, universal health care coverage is eminently affordable; indeed, it is a trivial societal expense the instant halfway, futile, end-of-life, and custodial care are subtracted from the package. We can do that humanely with SA, or inhumanely with restrictions on care. There really isn't any middle ground.

References

1)  Gupta S. Cheating Death. Grand Central Publishing; 2009. ISBN 044650887X.

2)  Thomas L. The technology of medicine. In: The Lives of a Cell. New York, NY: Viking Press; 1974:31-36.

3)  Silver JK, Wilson DJ. Polio Voices. Santa Barbara: Praeger Publishers; 2007. p. 141.

4)  Wilson DJ. And they shall walk: ideal versus reality in polio rehabilitation in the United States. Asclepio. 2009;61(1):175-92.

5)  Smith JS. Patenting the Sun: Polio and the Salk Vaccine. William Morrow & Co; 1st edition; 1990. ISBN 0688094945.

6)  Juskewitch JE, Tapia CJ, Windebank AJ. Lessons from the Salk polio vaccine: methods for and risks of rapid translation. Clin Transl Sci. 2010;3(4):182-5.

7)  Bookchin D, Schumacher J. The Virus and the Vaccine. Macmillan; 2004. ISBN 0312342721.

8)  Office of the Actuary, Centers for Medicare & Medicaid Services. National Health Expenditure Projections 2009-2019: http://www.cms.gov/NationalHealthExpendData/downloads/NHEProjections2009to2019.pdf.

9)  Chernew ME, Baicker K, Hsu J. The Specter of Financial Armageddon — Health Care and Federal Debt in the United States. NEJM, published March 17, 2010, at NEJM.org (doi:10.1056/NEJMp1002873). http://healthpolicyandreform.nejm.org/?p=3170. Retrieved December 23, 2010.

10) Freitas R. Nanomedicine. 1999: http://www.nanomedicine.com/NMI.htm.

11) Fahy GM, Wowk B, Wu J. Cryopreservation of complex systems: the missing link in the regenerative medicine supply chain. Rejuvenation Res. 2006;9:279-91.

12) Wowk B, Fahy GM. Toward large organ vitrification: extremely low critical cooling and warming rates of M22 vitrification solution. Cryobiology. 2005;51:362.

13) Wowk B, Fahy GM. Ice nucleation and growth in concentrated vitrification solutions. Cryobiology. 2007;330.

14) Wowk B. Thermodynamic aspects of vitrification. Cryobiology. 2010;60(1):11-22.

15) Fahy GM. Vitrification: An overview. In: Liebermann J, Tucker MJ, eds. Vitrification in Assisted Reproduction: A User's Manual and Troubleshooting Guide. London: Informa Healthcare; 2007 (in press).

16) Fahy GM, Wowk B, Wu J, Paynter S. Improved vitrification solutions based on predictability of vitrification solution toxicity. Cryobiology. 2004;48:22-35: http://cryoeuro.eu:8080/download/attachments/425990/FahyImprovedVitriSolns2004.pdf?version=1&modificationDate=1285892436630

17) Fahy GM, Wowk B, Pagotan R, et al. Physical and biological aspects of renal vitrification. Organogenesis. 2009;5(3):167-175: http://cryoeuro.eu:8080/download/attachments/425990/FahyPhysicBiolAspectsRenalVitri2010.pdf?version=1&modificationDate=1285892563927

18) Fahy GM. The role of nucleation in cryopreservation. In: Lee REJ, Warren GJ, Gusta LV, eds. Biological Ice Nucleation and Its Applications. St. Paul: APS Press; 1995:315-36.

London at Apogee: A Reflection on the Criticality of Life Affirming Values to Economic Viability and Personal Survival
http://chronopause.com/index.php/2011/02/08/london-at-apogee-a-reflection-on-the-criticality-of-life-affiriming-values-to-economic-viability-and-personal-survival/
Wed, 09 Feb 2011 07:18:02 +0000

By Mike Darwin

Figure 1: London, 2012, still magnificent, but declining from apogee.

 

Timing is (almost) Everything

Most of the essay below was written on 16 June, 2008. It was written as a post (including all of the financial graphics) for a critical care medicine list-serve called CCM-L – a venerable, but at the same time quirky and eclectic, forum for discussing critical care medicine and topics that could transform it, for good or ill (even if they are seemingly far afield from the brass tacks of medical technology, per se). Some time ago, I'm not sure quite when, something I saw in an e-communication from the Cato Institute alerted me to the work of the economic analyst Michael Mandel, in the form of his seminal article, "Why the Jobs Crisis is Actually an Innovation Crisis." Mandel's analysis started me working to rewrite my CCM-L piece into a more rigorous (and less personal) exposition of my ideas. Subsequently, Mandel's article prompted a more exhaustive and insightful analysis of the current financial meltdown by the even more prestigious economist Tyler Cowen.

Figure 2: Andrew J. Galambos (at right).

I downloaded Cowen's e-book The Great Stagnation within days of its e-publication. I agreed with both men's ideas, and especially liked Cowen's emphasis on how badly scientists are treated in our civilization, and the truly catastrophic effect this has had on innovation and technological progress. In fact, two people in the cryonics community who have known me for some years, Steve Bridge and Danila Medvedev, will, I think, attest that I've been going on about this idea at length, and bordering on hysteria, for years. In the case of Steve Bridge, it has been for at least 30 years! So, I was very receptive to Cowen's arguments, because they fit precisely with what Andrew J. Galambos taught me in 1973-4. (In exchange for the many wasted hours I spent listening to him, the idea of the primacy of intellectual property in creating wealth, and of the high cost of mistreating its creators, made it all worthwhile.)

However, as I read Cowen's arguments, I realized that he was fundamentally wrong about a critical issue – namely, his assertion that productivity has not increased dramatically since the early 1970s as a result of technological advance – in particular, stunning advances in computer and information-handling technologies, as well as the associated increased automation in manufacturing. I believe that he and Mandel suffer from not moving about in the world enough, and from not looking at, really seeing, and understanding industrial and manufacturing advances first hand, in the context of 20th century technological history.

Consider what I am doing now. I am 'publishing' the equivalent of a book every couple of weeks! And not just any book, but a lavishly illustrated and very nicely formatted technical book, with enormous amounts of research and references. In 1982, it would have taken several times the effort, and a week or two of time, to generate one technical paper 10 pages long, with almost no art and very few references. Researching papers was nightmarish, and involved driving into Los Angeles, an hour away, and searching the card catalogs at the UCLA Medical Library… Preparing a single article for publication took at least half a dozen people, if you include the printers' staff.

This would all be of less relevance if the same increase in productivity were not also clearly going on across the board, in the whole of the economy. It is, and Mandel and Cowen would have realized that this is so if they had ever closely interfaced with a robust cross-section of industries over a decades-long period of time.

Because of cryonics, I had to see, interface with, and become familiar with the manufacturing processes of businesses of many kinds. I toured Dow Chemical’s Zionsville research and production facilities, and spent countless hours learning to do tissue culture there when I was 12-13 years old, in 1966-7.  So, Mandel and Cowen are clearly wrong about the increase in productivity being slowed to the extent they assert. They are, however, absolutely right that the average man on the receiving end of this technological bonanza has been seeing increasingly flat gains in personal wealth, and now is seeing net losses. That is real. But what they are failing to see, and the question they are failing to ask is, “why did this happen and where did all the wealth from that increased productivity go?” Somebody undoubtedly got richer!

The answer is in my 2008 CCM-L post: this wealth was stolen by hidden taxation and by largely hidden (and unappreciated) inflation, coupled with irrational and unsustainable expenditures in areas such as health care and defense. My rough guess is that, conservatively, 60 to 70% (more now) of each individual's productivity is taken from him before he ever gets his paycheck. The decaying, and now failing, infrastructure in the US is proof positive that this wealth isn't going to fund the basic and 'good' things government can do – such as building and maintaining roads, dams, utilities, land reserves – and maintaining basic public health. INSTEAD, IT IS BEING STOLEN AND WASTED.

So, whilst not disagreeing with either Cowen or Mandel, my point is that they miss the elephant in the room; if their suggestions for 'fixing' things were implemented tomorrow, it would merely result in ever-increasing thievery, and certainly in no net long-term gain for the real producers of wealth.

I have reproduced here my 2008 post, with minor edits for grammar and punctuation, as it first appeared, minus some irrelevant personal dialogue at the beginning. To this material I have added my analysis of Cowen's and Mandel's works, and I have highlighted this added text in gray type. I believe that these two men have indeed identified a critically important idea underlying the current economic crisis and slowdown in innovation. But I do not believe that it is, as Cowen asserts, primarily due to "exploitation of the low hanging technological and natural resource fruit on the tree." It is instead due to theft and, more importantly in the long run, to the debased treatment of the real engines of wealth creation and technological advance: the scientists, and all the others in our civilization, who innovate and produce new ideas. Most thinking people now (hopefully) understand that collectivism, when applied to industry or ordinary commerce, quickly destroys incentive and impoverishes the economies of those who practice it. What must now be understood is that collectivism with respect to intellectual property (the real font of all wealth and progress) is many orders of magnitude more destructive than it is when applied 'only' to the means of production. Unfortunately, I believe that this realization will be even more transformative than it will be long in coming.

- Mike Darwin, 22 January, 2011

 

As Good as it Gets, for Now?

While I was in London, in June of 2008, I met a banker from Citigroup; a very intelligent and savvy fellow from New York City. Most of you have met, or at least know of, the type I'm referring to here; big apartment in Manhattan, house in Connecticut, spouse who is 10 years older and very senior at Merrill Lynch, Pierce, Fenner & Smith. He was in the UK to try to contain massive losses to Citigroup from major clients and, to be blunt, to try to save Citigroup itself! I was also told that Bank of America was holding so much suspect paper that it, too, might go under.

I have long believed that a major depression, i.e., a collapse of much of the apparent worth of the financial system in at least the US, was imminent (imminent in these terms means ~10 years, plus or minus 5). What I was told during this lengthy and intense conversation in late June made my mouth go dry – and I felt real fear. I have to say that no small part of my happy recreation and carefree time in the UK was motivated by a steadily growing gut feeling that the 'party was almost over,' and that this might well be my last chance to really enjoy a world that was at the apogee of 8,000 years of technological civilization. It sounds hyperbolic and melodramatic, but it is nonetheless true – the part about 'apogee' and 'civilization,' that is…

London, in particular, is at apogee; at no time in the 2,000 year history of the city has it been cleaner, wealthier – more equitably wealthy – healthier, more rational, safer, less prejudiced, more inclusive, more tolerant, better educated, or, arguably, happier. These things are not open for debate: they can be proved by both the objective numbers, and the evidence of the senses. If you visit London, and if you have the slightest inkling of the history of the city, it is obvious that things have never been so good for so many; rich and poor alike (and yes, it is possible for things to be better, even for rich people; dentistry, hip replacement, plastic surgery…) It may credibly be argued that London, and a few other cities like it, represent the absolute apogee in the quality of human life, wealth, technological sophistication and justice, in the entire history of our species.

People on average are more educated, are enjoying vastly better health, are living vastly longer, are suffering (physically) vastly less, and are, in absolute terms, vastly wealthier than at any time in human history. In London, they are seemingly happier, or at least more visibly joyful, than most urban populations I've observed. London is one big happy party compared to Moscow, where 'grim' and 'stoic' are the operational adjectives. People are relaxed; they laugh and talk in public spaces; they smile, incessantly listen to music on tiny devices, chat on the phone, and converse earnestly with each other on the tube, and even on the buses in the wee hours. I am no cheerful Pollyanna, and yet I notice these amazing, rapid, positive, and largely technologically driven changes.

A Chance Encounter and an Historical Perspective

Where I differ from most people is that I could put this experience, this chance encounter with a worried banker, into the context of just about all we know of human history. An arrogant statement, I know, but it is true. I felt both blessed and awed to be in a place, and at a time, where our species has made life the best it has ever been for 15 million people in one place at one time – and to have the opportunity to enjoy it at almost every level. I confess, there was a strong, indeed almost all-pervading, sense of "after me, the deluge…"

Figure 3: Gold, the last refuge of the desperate ‘investor.’

My Citigroup acquaintance, and later a young banker from Credit Suisse, whom I also talked to for several hours, explained the real mechanics of what had been done to the financial system, and how far-reaching and catastrophic they each thought it was going to be. Ironically, while they both had substantial personal means, they were also, as is often the case when you are inside a problem, totally at sea as to what to do personally to protect themselves. I told them that if it was as bad as they said it was, the only thing they could do was to get out of debt, convert a fraction of spare cash to gold (preferably K-Rands) and take physical delivery of it; no gold shares or mining stocks. I said that how much of their assets to convert to gold was a function of how bad they thought and felt it was really going to be – and how much they were willing to risk in terms of large, short-term losses from cashing out time-locked financial instruments, liquidating investment real estate at a suboptimum time, and, of course, losing the productive return on their money until the worst of the deflation-inflation was over. Gold neither earns nor creates wealth; rather, it is a costly storage mechanism for wealth that halts productive return on capital, is inconvenient to hold and use as currency, and is highly susceptible to theft – without recourse to recompense by insurance.

When I learned in full what the financial community had (most recently) done to precipitate this crisis, I was at a loss for words, and so upset that I just sat in the dark and, later that day, told the good friend I was staying with, in detail, what I had learned. He was a bit sobered by it, but he was, and probably still is, too bewitched by what he sees as the silver lining to this situation (at least from his perspective) to understand how dire it could well become for all of us. He owns properties, and the structure of his investments is such that the falling value of currency and home prices is very advantageous to him, and he believes that his choice of investment properties is essentially secure against even grave economic downturn and contraction. So, I could not reach him with my 'news' that the sky had indeed started to fall, and clearly I could not reach anyone else with my gloomy message. I thought back to my reading of Thomas and Morgan-Witts' The Day the Bubble Burst in the early 1980s, about the 1929 stock market collapse and the ensuing global depression, and about Paul Krugman's The Return of Depression Economics in 1999 (the book that inspired me to cut loose and start travelling and living (it up)) and, of course, Nassim Taleb's masterpiece The Black Swan. I realized that this is always the case in such times and in such situations, so I quit playing Chicken Little, and continued to grab life by the balls (to be both crude and accurate) while I could.

Any fool can say that there will be another 1929, or a similarly bad crisis, and be right on the money, with only one niggling little problem: saying when it will happen with a useful degree of precision. I posted my timeline on CCM-L some time ago, and it ranged from tomorrow, to a year from then, or maybe two years, at most. However, this is neither sufficiently precise, nor does it carry sufficient weight, to have allowed me to be taken seriously. So, I mostly kept my mouth shut, both before and after my trip abroad.

One of the things always said, and widely believed, whenever any economic system has gone non-rational, is that "it is a new paradigm," a "new economy," and that the "underlying economic rules of the game have fundamentally changed." I find evidence of this going back as far as the Roman Republic! So, when I began really howling about the bad track I saw US finance going down, way back when the Savings and Loan fiasco occurred, and Charles Keating went to jail (briefly), nobody paid attention to me, and they were absolutely right not to have: a whole generation of people came and went in that interval, and vast fortunes were made and lost. But on balance, more were made than lost.

How was this possible? Ironically, the answer turns out to be that the economic lunatics were right, there was a fundamentally new economic paradigm, and it had changed everything, but just not in the way they thought. I’ve actually written on CCM-L about this insight, and this paradigm change. But I’ll recap it here, because it is, shortly (a decade or two from now?) to become received economic wisdom, and part of the canonized explanation for why what is happening now happened, and how it happened.

Figure 4: Archetypal 'modern' kitchens spanning 80 years of US history. The slowing pace of everyday technological change is most evident in the transition from 1920 to 1940 – a point when so-called 'low hanging fruit' technologies, such as electrification and civil engineering, had matured.

Within the last few months, two prominent economists, Michael Mandel and Tyler Cowen, have been writing about what they term an 'innovation interruption' or an 'innovation slowdown.' Mandel puts his best case forward in his 2010 article, "Why the Jobs Crisis is Actually an Innovation Crisis." In his e-book The Great Stagnation, Cowen argues for this being an era of technological stagnation in considerably greater detail, and invokes a different primary causal mechanism: namely, the idea that the US economy has been driven by a binge on the "low hanging technological and natural resource fruit" that was available until the early 1970s in the US.

I first noted, and wrote about, this slowdown in everyday technological transformation, and in working- and middle-class wealth, in the late 1980s, and I christened it the "Family Affair" paradox. Family Affair was a television sitcom that ran from 1966-71. The program was bland and unremarkable, but one thing I noticed watching a rerun, whilst confined to a doctor's waiting room, was that nothing in the living spaces in which the program played out would have been out of place in 1989. The only noticeable changes were automobile styling and women's couture. Since I am a 'classic movie buff' with an insatiable appetite for films from the 1930s through the early 1960s, I noticed that it would be impossible to mistake an interior from 1940 for one from 1950 – let alone 1960. This phenomenon was brought home to me again some years ago, when I toured a "Kitchens of the 20th Century" exhibit at the Smithsonian, in Washington, D.C. The slowing of apparent technological change and growth in wealth (spaciousness and quality of finishes) mirrored what I had seen in film and television – and what Mandel and Cowen assert their economic analyses also show.

Figure 5: Economists Michael Mandel (left) and Tyler Cowen (right).

It wasn’t just fashion or style that had changed; it was the large and very evident increase in the wealth of the working and middle class population, as reflected through those films. The magnitude of change had obviously greatly slowed during the interval from ~1970 to ~1990, but why?

Mandel and Cowen have a number of arguments to explain this slowdown, and, far more importantly, they have solid economic data to back them up. They argue that at least these factors are in play:

  • The US was still largely virgin territory at the beginning of the 20th century. It still had vast fossil fuel reserves, a huge reserve of unexploited and agriculturally rich land, and a largely rural population of intelligent and ambitious young people who were as yet uneducated – and who thus could be turned into a valuable asset not previously available. That's all gone now.

Figure 6: Harvesting the low hanging technological fruit?

  • The most transformative basic technologies that have created widely distributed wealth and jobs were largely products of 19th century scientists and entrepreneurs. Edison, Tesla, Ford, Dow, and DuPont did the 'easy' and highly profitable science that really produced widespread improvement in the standard of living, such as artificial lighting and widespread electrification. These 'easy' technologies have now been harvested, much as is the case, say, in physics. Newton could integrate physics and invent the calculus whilst sitting under an apple tree in Woolsthorpe. All he needed was his mind, and a considerable body of observation that required little technology, but a great deal of time and patience (both of which were at a premium before the Industrial Revolution). Still, he did not require a large hadron supercollider, nor any other multimillion, let alone multibillion, dollar infrastructure. Those days have largely vanished from physics, and from many other branches of science, where the 'oil oozing out of the ground' has been scooped up and sold. Science, like oil exploration, has had its easy pickings, after which point discoveries get more difficult and costly to tease out of the natural world.
  • This civilization treats scientists like garbage. I have written letters to the London Times and to The Guardian expressing my sadness and frustration that while there are countless statues to soldiers and generals – there are none to Darwin, Newton, Telford, Turing, or the countless other British minds that essentially enabled scientific-technological civilization. My suggestion to put a statue of Newton or Darwin on the Fourth plinth in Trafalgar Square (which is empty of sculpture or statuary) was rebuffed in a snide email from an editor at The Guardian who suggested that I “return to the US, and erect such statues in my hometown.”

All of these observations are, of course, valid, and no doubt contribute significantly to the technological slowdown. However, as attractive as I find these ideas, they do not begin to adequately explain the current economic situation, nor do they really explain why we are all not a lot richer than we are. I say this because, by any measure of actual increase in the efficiency of production, we should be much, much richer than we are. Cowen, in particular, makes the point that advances in computer technology have not really improved the lot of the average citizen in the West, in terms of real wealth. And he is right. What he is wrong about is why this is so, because clearly, if you actually visit contemporary factories, all you see is automated production – production that is orders of magnitude more efficient than pre-computerized methods. So, the question that should be asked is, "Where did all that extra wealth go, if it didn't go into the average citizen's pocket?"

Below is a graph of the DJIA performance from 1900 to the present. It is how traders, investors and bankers like to look at the data, and if you look at it that way, you will be very reassured. In fact, no matter how you plot the data it confirms something very important: we (the West) have gotten richer as a civilization at greater speed than at any other time in human history (i.e., through productive means, that is, as opposed to violent conquest and pillaging).

Figure 7: DJIA average from 1900 through the present.

You can also see this if you look at the per capita oil consumption as a marker of economic activity, and more arguably, productivity from 1900 through the present, not only in the West, but across the globe. We have gotten undeniably wealthier and at a seemingly impressive rate.

Figure 8: Oil Consumption in leading economies from 1990 to 2005.

However, these data are misleading, because they show only the overall share value and the absolute energy expenditure; they do not show these in relation to the overall wealth generated or, more importantly, to the fraction of that wealth that is retained by the people who really create it. For instance, look at the curve for Japan if you want to see the future for the US, and much of the rest of the world; gold-rich nations with little debt and little bad paper from the US and Europe will likely be spared some of the worst of what is to come.

How much wealth has been diverted is a tricky thing to determine, because you have to subtract out various kinds of parasitism, which is very difficult to do on an objective basis. For instance, governments do provide real benefits and services for their citizens: clean water, transportation infrastructure, law enforcement, the justice system, sanitation and public health, and so on. These things are costly and necessary. But how costly are they, in both absolute and relative terms, and what is essential, and what is simply waste, theft, or bad decision making? This is really hard to know.

When I first got interested in this issue, my perspective was very simplistic: historically, nation-states (and empires) collapse when the taxation burden on their populace exceeds ~30% of the GDP, or its equivalent. So, it would seem simple enough to look at the taxation rate and come up with a number as to how close to that historical margin we are at any given time; assuming, of course, that this number still applies, because in the past, people's incomes were just barely enough, or a little more than was required, to keep them alive, or in a modest (very modest by today's standards) zone of comfort. When I was a child, people did not have a lot of chattels, and essential items like shoes or school clothes were costly, and socially a 'big deal' to purchase. I would estimate that 10 to 15% of the kids in my primary school had shoes with holes in them that they lined with cardboard. It was a working class neighbourhood, and money was tight. That situation has almost vanished from the West; in fact, we export discarded clothing and shoes to the Third World by the millions of metric tonnes each year!

So, 30% taxation on total earned income almost certainly does not equal the breaking point for parasitic load today, because that breaking point probably represents the fraction of earned (and available) income you have to take from a population before they start to be acutely uncomfortable, begin to be unable to buy necessities AND become fearful about their prospects for long term stability, and even for their personal survival. The huge absolute growth in wealth has thus destroyed the utility of this at least 2,000-year-old indicator for predicting how much theft is intolerable to the continued functioning of the socioeconomic system.

Figure 9: Regulatory burden constitutes a form of hidden taxation – and if the funds are squandered, that constitutes theft from the people who produced that wealth.

During the Nixon-Greenspan nightmare of wage and price controls that occurred as a result of inflation secondary to the Vietnam War and the Arab Oil Embargo (and a sharp rise in oil prices) in the early to mid-1970s, I was very concerned. I was only 18 at the time, and knew nothing about economics. However, I did know that the only way I got my second cryopatient frozen was by the expedient of his son bribing two people with substantial sums of cash to get access to petrol for the cars we needed to move about in, and to obtain aviation petrol to fly the Cessna (with the patient in it) from Cumberland, Maryland to Detroit, Michigan. Fuel was rationed by edict of the Federal government in 1973.

By 1975 I had learned about the ~30% rule, and had also learned a little about economics, having read von Mises (and discovered the ‘Austrian economists’), as well as Bastiat. I then tried to figure out how much taxation there really was. Initially, I was very reassured. But then I realized something that I’ve not seen discussed elsewhere. The real extent of taxation was vastly greater than it appeared, because it was stacked, compounded and hidden in ways never before imagined, nor possible – indeed not even technologically possible – let alone socially acceptable.

I was buying a can of navy beans in the Alpha-Beta grocery store in La Crescenta, California one day in 1974, and I looked at the picture of the beans on the can and thought of home, back in Indiana. Farms, farmers… and then it hit me. The farmer pays property tax on his land, and that tax has risen sharply to pay for all kinds of government services unheard of in the past. The farmer also pays sales tax on every item he buys, both for personal use and, in many cases, for the conduct of his business. He pays a hefty tax on the fuel he uses to run his agricultural equipment, and of course he pays income taxes, social security taxes, and many use or permitting fees, as well. He also pays Social Security and income tax charges on his workers, as well as state Workers Compensation Insurance on each employee.

The farmer then sells his beans to agricultural wholesalers, who have many of the same tax expenses, and the wholesalers in turn sell to the processors, who have even greater tax burdens, because they have many employees and a generally broader and deeper interface with commerce. And more transactions mean more opportunity for taxation. The retailer who puts the can of beans on his shelf has even more of these expenses, and he must advertise as well, which is highly taxed (TV licenses, lots of employees, and on and on). Finally, I walk into the store, having paid my SSI, income tax, and countless use fees (driver’s license, road toll fees, professional licenses, and on and on), and I buy my can of beans, and then I pay the California state sales tax, which was, I think, between 1% and 2% at that time (again, that was in 1974).
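The compounding this bean example describes is easy to make concrete. Here is a minimal sketch, with stage tax rates invented purely for illustration (the text gives no actual figures except the circa-1974 sales tax of 1% to 2%), showing why taxes layered along a supply chain exceed the simple sum of the individual rates:

```python
# Illustrative sketch of how taxes levied at each stage of a supply chain
# compound into the final shelf price. All stage rates are invented for
# illustration; they are not historical figures.

stages = [
    ("farmer",     0.12),  # property, fuel, payroll taxes as a share of his price
    ("wholesaler", 0.08),
    ("processor",  0.10),
    ("retailer",   0.09),
]
SALES_TAX = 0.015  # roughly the 1974 California sales tax per the text

price = 1.00  # normalized pre-tax cost of producing the beans

for name, rate in stages:
    price *= (1.0 + rate)  # each stage passes its tax burden on in its price

shelf_price = price * (1.0 + SALES_TAX)
hidden_burden = shelf_price - 1.0
print(f"shelf price: {shelf_price:.3f}, cumulative burden: {hidden_burden:.1%}")
```

The point is that each stage’s tax is levied on a price that already embeds the taxes of every stage before it, so the cumulative burden (about 47% under these assumed rates) is greater than the roughly 40% you would get by simply adding the rates.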

I realized then that the real tax burden was both vast, and either not calculable, or very difficult to calculate. Since 1974, the sales tax in California has increased to 7.25%, and local supplementary taxes (city or county sales taxes!) are allowed, bringing the combined rate up to 8.75%; the average is around 7.5%. But that is just the tip of the iceberg of the increase in predation on income since 1974, because since then, truly vast amounts of costly regulation and fees have been put in place at every level, and on every kind of economic activity. The biomedical research operations I’ve managed over the years are a case in point. Starting around 1980, we began to pay the following fees and charges, which we’d never seen before:

  1. Biohazardous Waste Disposal Permit,
  2. Hazardous Chemicals Disposal Permit,
  3. Disposal and in-house tracking paperwork costs associated with 1 & 2 above,
  4. Dead research animals were reclassified as biohazardous waste, meaning we could no longer use rendering plants that paid us for the carcasses, but instead had to pay for costly and documented incineration by specialized providers with no economies of scale,
  5. Lawn sprinkler backflow prevention system and yearly validation fees,
  6. Knox Box installation (to allow fire-fighters 24/7/365 access to our building by essentially providing them with a key!),
  7. Hazardous materials and fire inspection fee,
  8. Annual fire sprinkler testing, validation, government-mandated maintenance and inspection fee,
  9. All contractors working in the city of Rancho Cucamonga were now required to purchase a city business license!
  10. Sharp escalation in Federal licensing fees for required permitting,
  11. Imposition of state pharmacy board inspections and associated fees,
  12. Imposition of the requirement that a tiny business like ours (and others like us) carry workers compensation insurance, which we had to pay at the highest rate (the nursing home rate) because they could not easily classify us,
  13. Huge increase in the cost of scientific and professional books, because publishers and wholesalers were not allowed to carry over unsold inventory past the end of the fiscal year without paying property tax; you see the same thing with car dealers, who MUST clear their inventory at their fiscal year end or pay taxes on all the cars sitting on the lot,
  14. Enforcement (for the first time) of property taxes on all of our equipment and furnishings,
  15. Requirement for building and planning permission for all construction of any kind; put in a sink, pay a huge fee plus the costs of the application, the professional drawings, and the city inspection after it is installed. Even simple non-load-bearing dividing walls, or cutting a hole in a wall to create a pass-through from one room to another, required permitting and often inspection – all at a charge,
  16. Countless new requirements that we use only certified or professional personnel in every area of operations, from animal care techs (including the people who cleaned the kennels) to the laboratory facilities and chemical suppliers used,
  17. Total prohibition on the use of any expired product in animal research; this alone doubled our costs of carrying out experiments,
  18. Required training for employees to prevent sexual harassment or other discrimination,
  19. Explosion in costly signage requirements; we had to buy very costly signs to post sexual harassment laws, workers comp laws and employee rights, warnings to employees about defrauding workers comp, the anonymous complaint system, lighted signs over all exits (thousands of dollars in wiring alone), and countless safety signs (no mouth pipetting, caution slippery when wet, eye wash, first aid and other safety signage),
  20. All accessible pipes and cables had to be labelled at fixed intervals as to what they conducted, and had to have arrows showing the direction of flow; new construction had to be labelled before the walls were closed… inspection of same, for a fee.

I can’t even remember all of the fees, and if I did, the list would run to many pages. Beyond these fees, there was the new practice of charging both ‘conditional use fees’ and infrastructure fees against any new construction or development. So, for instance, when we wanted to put in additional parking spaces, we were told we would have to pay for the installation of an electronic crosswalk sign on a street corner 3 blocks away, at a cost of $11,000! Developers were told they had to pay for vast runs of city sewage or water pipe, or pay for sidewalks, stoplights, and other infrastructure formerly paid for by general taxes on everyone. Large projects, such as the local super Wal-Mart here in Yucca Valley, have been stalled for years; in this case because they want Wal-Mart to pay for an entire sewage/water treatment plant for the region (Yucca Valley, Joshua Tree and 29 Palms). This would then allow these cities to switch from septic systems to sewers, at which point the cities involved will charge each householder about $4,000 for the sewer line and require them to pay a private contractor to hook it up to the household drains; roughly another $3,000 cost, on average.

All of the above is hidden taxation, and it often represents theft, or the shifting of taxes from the productive creation or maintenance of civil infrastructure to wasteful, or actively destructive, spending. By the mid-1990s, I reckoned that the actual taxation on productivity was in the range of 60% to 70%. It simply was not obvious, because so much of it was hidden, and it was not felt, because the wealth being generated by increased manufacturing and data handling efficiency was so vast that it was now possible for people to be very comfortable on what amounted to the leavings from their real productivity.

Figure 10: Computerized and automated manufacturing have greatly improved economic productivity; but where have the profits gone?

Most of the increased productivity, and the means to siphon it off, have come about as a result of technological advances in computing, information handling, telecommunications, and the automation of manufacturing. Despite these obvious and huge increases in the efficiency of the process of production, distribution and sales at every level, people did not work less than they had in the previous few decades; instead they worked a lot more.

As an example, all of the benefit accrued from not having to wash clothes by hand on scrub boards, starch them, hang them out to dry, take them down and iron them; from not having to spend 4 hours a day preparing meals (mostly from scratch); and from not having to go to the market every day, thanks to refrigeration, freezing, and the development of ‘Twinkie-style’ food preservation technology (no more weevils in cereal or flour), was effectively wiped out when women had to go to work. Now, two incomes were required to maintain the same, or a lower, standard of living. Granted, people were getting fundamentally new types of goods and services. But even factoring in the increased productivity, they had to work much longer and harder than they previously did to get the same basic standard of living. This was not possible in the past, because people already worked just about as hard as they could simply to maintain a subsistence (survival) existence. There was no ‘play’ in the system.

Finally, not considered was what was happening to the real value of the currency. The easiest, and second oldest, way for nation-states to steal from their people is by ‘manipulating’ the value of the fiat currency they issue (the first way is taxes). Greenspan, evil genius that he was, cleverly manipulated the Federal Reserve so that inflation never seemed an issue. However, the real story was there for anyone who wanted to look for it, in the form of the decline in the real purchasing power of the US dollar over time, as shown in Figure 11.

Figure 11: Purchasing power of the US dollar from 1900 to 2000.
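The arithmetic behind a purchasing-power curve like the one in Figure 11 is just compound depreciation. A minimal sketch (the 3.5% annual rate is an assumed round number for illustration, not an official CPI figure):

```python
# A minimal sketch of the arithmetic behind a purchasing-power curve:
# even 'modest' annual inflation compounds into a dramatic loss of
# purchasing power. The 3.5% rate below is an assumption, not CPI data.

def purchasing_power(annual_inflation: float, years: int) -> float:
    """Fraction of a dollar's purchasing power remaining after `years`."""
    return 1.0 / (1.0 + annual_inflation) ** years

for years in (10, 20, 50):
    remaining = purchasing_power(0.035, years)
    print(f"{years:3d} years at 3.5%: ${remaining:.2f} of each dollar remains")
```

At an assumed 3.5%, roughly half of the dollar’s purchasing power is gone in twenty years, and over 80% of it in fifty; which is the general shape a 1900–2000 purchasing-power curve traces.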

I can’t (for the reasons just cited) show you the real fraction of wealth being taken by taxation in all its forms, but it is at least as great as what you see above; and much of the loss from direct and indirect taxation can arguably be added on top of the loss in purchasing power.

Anyone with a picogram of common sense, who understands the laws of thermodynamics and the conservation of matter and energy on a practical, if not a theoretical, level, knows that sooner or later something has to give; TANSTAAFL[1]!

What that breakpoint is (in objective terms, as the percentage of the GDP removed from productive use by the populace) and when it will come, is virtually impossible to predict. All that increase in productivity is fundamentally new, and it did change the paradigm. It changed the paradigm by allowing unprecedented theft of profits to go on at a higher percentage of the GDP, and for a longer period of time, than was ever possible in history before. Essentially what happened was that for the first time in history, the rate of increase in productivity more or less kept pace with the rate of increase of what can only be described as a historically unprecedented and savagely rapacious growth in the theft of wealth from its producers.

Arguably, critical care medical professionals are uniquely equipped to understand this, and even then, only those who are older, who practice in the Third World, or who visit and work there doing locum tenens. Historically, critically ill people died very rapidly. In fact, most people died in pretty good shape. I see this change reflected in cryonics from the time cryonics was first proposed in 1964 to the present. In the 1960s we used to get ‘high quality’ materiel in terms of patients. Cryopatients had mostly intact organ systems, mostly intact vascular endothelium, and were almost never massively oedematous: some congestive heart failure patients had oedema, but there simply were no ‘Michelin men and women’ (let alone Michelin children!). The application of externally imposed homeostasis now allows us to extend the dying process, such that survival is possible with levels of organ functioning so compromised, and systemic injury so severe, that these patients would have died long before they reached that state in 1964. We can do this because we have vastly better ‘half-way medicine’[2] and the vastly greater wealth that allows us to apply it. This is the perfect analogy for what has been happening in the current economy, and it is also directly responsible for some of the economic woe, both now and to come.

Around 1995 we went from a situation where increased productivity was keeping pace with increased theft, to the beginning of true ‘critical economic illness,’ which can be defined as the point at which you must start incurring debts to maintain homeostasis (e.g., early volume replacement by fluid resuscitation in the patient in shock). The graph below shows household debt in the US from 1980 through the present (projected to 2010):

Figure 12: Debt as a percentage of personal disposable (i.e., non-confiscated) income.

What isn’t shown is that about 80% of this debt is now in the form of what is euphemistically called ‘short-term consumer debt,’ which is a very polite way of saying that the people who gave all these other people money, did so with NO SECURITY OR COLLATERAL!  And, they did so without asking, “Why are these people borrowing all this money at such obscene interest rates, year after year?” The answer is, of course, because they need to borrow it in order to maintain what they consider an acceptable standard of living.

A good corollary point, also not considered, is that people don’t generally borrow money under such crummy terms, and in such large amounts, unless they have used up all of their other liquid or liquefiable assets. That this was the case is shown nicely (or horribly) in the graph of personal savings as the fraction of household income from 1985 to 2005 below:

Figure 13: Personal savings as a percentage of disposable income from 1985 to 2005.

In fact, in the late 1990s, as you can see from the graph above, consumers began to hit a wall in their ability to take on debt, and this was ‘damaging’ to the house of cards that is the economy. At that time the true short-term (unsecured) consumer debt was probably (realistically) around a trillion dollars, and the banks that had lent all this money absolutely had to have those credit card payments coming in – preferably a little late, so they could tack on extra fees and keep the revenue stream healthy. Unfortunately, people were tapped out of ready cash. There is, however, one important fact to consider (if you are a disreputable banker): about 2/3rds of the nation’s wealth is in homeowner real estate. That’s right, 2/3rds of all the value of the nation’s savings is in real estate that people mostly own to live in, or to rent to others to live in.

I also note that a few months ago, the US, for the first time in its history, went to a net negative savings rate; in other words, all the ready capital is gone (including most of the easily accessible equity in real estate).

Thus, we see the same no-responsibility, no-collateral lending practices that operated with credit cards extended to real estate, beginning at just this time. However, even that is not sufficient when you have pretty well gutted the value and fundamental productive capability of your economy (i.e., you have almost no manufacturing, you have mostly service industries, and you are running an inconceivable and unsustainable deficit in trade). The solution was to convert all this debt into securities and debentures, and then find greater fools; in this case, international banks and foreign governments who would buy this crap.

So, that $1.5 million home (whose price was driven up by a speculative frenzy enabled by ‘free and easy’ money for non-repayable loans) in an average neighbourhood in Orange County, California, is in reality owned by HSBC, Deutsche Bank, Credit Suisse, China, the UAE, and who knows who else! And not only is it not worth $1.5 million, it is (or soon will be) functionally worth NOTHING, because people have to be able to buy it – and that requires that they have both money and access to credit; and they will soon have neither.

Figure 14: The incredible disconnect between price, earnings, dividends and probable real value of shares; and of economic wealth as a whole.

And that brings me to the stock market. Look at the graph of market performance from 1870 to a few months ago (Figure 14). Now, if you assume a fundamental increase in productivity over and above the historical average, starting around 1950 (the effective start of computerization, improved telecommunications and a large scale switch-over to automation in industry and agriculture), and you evaluate the actual growth in productivity based on dividends, or even draw a line between dividends and earnings, you will see a truly terrifying disconnect between price and real wealth. I’ve illustrated this with a yellow line that shows the likely real historical slope at which wealth is increasing. And that rate of increase is certainly not what you see below. So, if you want to know how big the market correction is ultimately going to be, the minimum is a fall to between 700 and 900 on the DJIA. It may go lower, because some of the value of the healthy parts of the economy, such as businesses which are currently sound, may be destroyed as a result of the fallout from lack of credit, lack of jobs, and consequently lack of money to buy goods and services.

What kind of cretin or fool could be persuaded that the real value of shares, real estate, or anything else in a stable, productive, non-speculative economy can increase by 70% to 80% in the period from 1990 to 2008??? That would truly herald the arrival of what another group of fools and idiots call ‘The Singularity’; a point in time (coming almost any day now, we’re told) where technological progress becomes, more or less, some large exponent of exponential! Belief in Singularities is common in people, because they certainly happen to individuals (windfalls) and to companies (Google). But historically, there is only one kind of singularity in technology, finance, and human civilizations, and that is the Negative Singularity, where a bubble collapses (tulips, the South Sea Trading Company, swamp land in Florida, hot stocks in the 1920s, or the Roman Empire).

Of course, government officials, stock brokers and bankers don’t work lifetimes at their jobs anymore; they hop around like fleas from one dog to another. They might be able to hold five years of data in their heads, if we are lucky. But mostly, they exist in a world that endures only from quarter to quarter. I love that ad from Scottrade on TV. I only remember one line from it, which goes something like, “Our funds consistently outperformed the 5-year Lipper Average.” Five years is about it in the current financial world’s timescale. How long do you plan to be retired for, or to invest in your retirement before you retire? Five years?????

Below are two graphs presented the way governments, bankers, stockbrokers and financial analysts like to present market data and, much more importantly, how they actually see the data in their tiny minds.

Figure 15: The DJIA data as the ‘wizards of Wall Street’ want you to see it.

These data look very reassuring, because they are log plotted over very short periods of time. They do not reveal that what is in fact happening is the metabolic equivalent of uncoupled oxidative phosphorylation; lots of oxygen consumption, lots of substrate use (consumer spending), lots of heat produced (high share prices) and ZERO net production of ATP (no real economic growth or true increase in share value). The net effect of this for cells and organisms is death, and this is also true for economies and nation states that operate similarly.

In humans (and other animals) who suffer this kind of uncoupling, the desperate response is to turn to anaerobic metabolism, which is inefficient, inconvenient, and wasteful. The economic equivalent occurs at the point where every single productive investment vehicle is deemed unsafe and unsound, at which point people stop trying to make money, and start trying to just hold onto it. They do this in the time-honoured way of buying precious metals, usually gold or silver, or other commodities or shares deemed bulletproof. This is the last-ditch effort to conserve wealth, because it means that much easier, more convenient, and more socially acceptable ways of dealing with economic instability, such as investing in real estate, or even in government bonds with low (or, in the case of Japan, essentially no) interest rates but presumably rock-solid reliability, are no longer options. And, looking at the Dow/Gold ratio as shown in the graph below, you see writ the final step in this decompensatory process, before Armageddon.

Figure 16: The Dow/Gold Ratio, which is arguably the more accurate indicator of the real value of stocks.

The naive optimist will look at this graph and say, “Well, look here, gold prices have been trending down since 2004.” Yes, on this graph that is true because if you want data from 2004 through the present, you have to pay for it. Gold is at historically high prices and, as of yesterday, I can’t even imagine what the Dow/Gold ratio is now!
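For what it’s worth, the ratio itself is trivial to compute: the DJIA level divided by the dollar price of an ounce of gold, i.e., how many ounces it takes to ‘buy the Dow.’ A sketch with rough, illustrative figures (approximate levels from around the 2000 peak and from mid-2008; not exact quotes):

```python
# The Dow/Gold ratio is just the index level divided by the price of an
# ounce of gold: how many ounces it takes to 'buy the Dow'. The figures
# below are rough illustrative levels, not exact market quotes.

def dow_gold_ratio(djia: float, gold_usd_per_oz: float) -> float:
    return djia / gold_usd_per_oz

print(dow_gold_ratio(11700, 280))  # ~41.8 ounces, around the 2000 peak
print(dow_gold_ratio(12300, 890))  # ~13.8 ounces, around mid-2008
```

By this measure, the ‘value’ of the Dow had already fallen by roughly two-thirds between 2000 and mid-2008, even while the nominal index level barely moved.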

The reality of this can be seen if we look at the real stock market returns over the same period, which show the typical rightward-displaced delay in reaction, but which nevertheless map the reality of an increasing number of investors fleeing every other kind of investment to ‘speculate’ in gold. And there is only one thing worse than speculation in gold; and that is when it isn’t speculation anymore.

So what can we expect? The short-term answer is, “I haven’t any idea,” and the longer-term one is, “A prolonged period of sheer economic hell at least comparable in intensity to the Great Depression of the 1930s.” What form it will take and what its particulars will be are not within my ken.

However, I can tell you that two almost certain consequences of this will be the emergence of a new crop of Hitlers, Stalins and Maos, as well as profound changes in the social and political order in all of the world’s large nation-states, and in many of the smaller ones as well; these changes will be accompanied by violence and chaos.

For those of you with liquid assets, the only thing I can suggest is to shelter them as best you can. It is a certainty that the US Federal Reserve will act to protect personal savings, and to bail out core financial infrastructure at any cost. And that cost will be further debasement of the currency. This will take many months, or even several years or more to play out, but it will escalate sharply as confidence in the currency is lost, and there is further erosion of productivity, and greater demand placed on a limited supply of precious metals, and other value-retaining commodities.

To quote the Bard:

“There is a tide in the affairs of men,

Which taken at the flood, leads on to fortune.

Omitted, all the voyage of their life is bound in shallows and in miseries.

On such a full sea are we now afloat.

And we must take the current when it serves, or lose our ventures. “

Truer words were never spoken.

- Mike Darwin, 16 June, 2008

Afterword: Death and Expired Morals – the Primary Causes of Our Ruin

I don’t want to fail to acknowledge that there is “a whole lot of stealing going on” in the old fashioned sense of the phrase. There is, in fact, significant unjust redistribution of wealth within the economies of the West, and this takes many forms, the most obvious of which are the insane compensation packages and salaries now given to corporate management. With Steve Jobs as a possible exception, managers have little to do with generating the wealth that the corporations they oversee produce. No corporate CEO is worth millions of dollars per year in compensation. In Jobs’ case, the bulk of his compensation has come from shares in the enterprise he founded, an enterprise which has returned value only to the extent that it actually performs in the marketplace – which again, under normal conditions, is a function of the desirability and performance of the goods and services that customers are willing to pay for. The so-called ‘Robber Barons’ of the late 19th century were pikers compared to today’s insanely overpaid corporate thieves.

Figure 17: Health care costs as a function of biological ageing.

However, the real problems bankrupting us (and make no mistake – there will be no durable economic recovery) are an expired system of morals, a debased system of values, and the resultant infusion of an inconceivable and unsustainable fraction of our wealth into the machinery of death. Ironically, one of the biggest capital-consuming death machines is the healthcare industry. No doubt that will come as a shocking and counterintuitive statement to many, but it is nonetheless a hard reality, and an artifact of our ‘halfway medical technology’, a kind of medicine first identified by the physician-author Lewis Thomas in 1974:

“Halfway technology represents the kinds of things that must be done after the fact, in efforts to compensate for the incapacitating effects of certain diseases whose course one is unable to do very much about. By its nature, it is at the same time highly sophisticated and profoundly primitive… It is characteristic of this kind of technology that it costs an enormous amount of money and requires a continuing expansion of hospital facilities… It is when physicians are bogged down by their incomplete technologies, by the innumerable things they are obliged to do in medicine, when they lack a clear understanding of disease mechanisms, that the deficiencies of the health-care system are most conspicuous… The only thing that can move medicine away from this level of technology is new information, and the only imaginable source of this information is research. The real high technology of medicine comes as the result of a genuine understanding of disease mechanisms, and when it becomes available, it is relatively inexpensive, relatively simple, and relatively easy to deliver.” —Lewis Thomas[3]

Figure 18: At left, rapidly diverging and increasing growth in the cost of health care relative to the GDP, and at right, the absolute cost of health care as a fraction of the GDP for the 100-year period from 1970 to 2070. Under no foreseeable circumstances is either trend sustainable.

The problem with halfway medical technology is that it doesn’t really work in the presence of aging and progressive degenerative disease. To try to treat such conditions without curing them is to experience exponentially greater costs, with comparably lower rates of return. In the parlance of auto ownership, it is to possess a car that has become impossibly expensive to maintain, and has thus become a ‘junker.’ Such cars are, of course, sent to the scrap yard. In medicine there is a name for this too: euthanasia.

At present, roughly 2/3rds of every healthcare dollar is spent in the closing decade of life, and perhaps as much as 60% of that expenditure pays for futile attempts to prolong the lives of clearly moribund patients – in other words, money spent in the last year of life. Not only is this a very unrewarding expenditure of resources for all involved, it is also not sustainable.


Currently, US health care costs are consuming a staggering 16% of the GDP! This is up from 12% of the GDP in 2008! These expenditures are not sustainable, and what is more, they represent the expression and use of a moral system that has expired, and long ago began to sour. The position of cryonicists, that individual human lives are invaluable and worth saving, not just ‘temporarily’ but indefinitely, represents the new and proper moral position for the technology we now command. Nothing can be done to save the lives of those of us who have exceeded the reach of our current, mostly halfway, medical technology – not even at the price of bankrupting the economy of the entire Developed world: nothing, that is, with the exception of the development of reversible brain cryopreservation. Only an enabling technology that allows patients – all patients – who have exhausted current medical technology to continue their journey to a future committed to rescuing them will work. The only alternative is to turn our hospitals and extended care facilities into death camps that will operate on a scale that would shame and humiliate Mao, Stalin or Hitler.

We know that this is so, but unfortunately, they don’t, and they aren’t about to figure it out anytime soon. To develop reversible brain cryopreservation for humans would consume a trivial sum, compared to what is spent on delivering even a single futile halfway medical technology, such as caring for patients with end-stage Alzheimer’s disease for a single year. And yet, it won’t be spent. And if such technology were developed tomorrow, it would not be used. Regrettably, we are much further off from any reasonable prospect of developing reversible whole-body suspended animation, by cryopreservation or any other means – something that the brighter bulbs in this civilization might be able to accept and embrace. Thus, the die is cast, and the system will crumble, or be transformed in ways that are unthinkable to any feeling person who values individual human life. We must organize and prepare ourselves for this almost certain eventuality, and do what we must to ensure we do not succumb to the carnage, as well.

Figure 19: US military spending in current (2011) US dollars.

If you are wondering where a large fraction of the rest of the death machine’s operating dollars comes from, look no further than military expenditures. I am no ‘kumbaya-singing pacifist,’ but these expenditures are insane, and they were completely avoidable. Had the US decided, at the start of the Arab Oil Embargo in the 1970s, to construct Fischer-Tropsch plants to produce diesel from coal, cut back on wasteful energy consumption, worked to develop technologies for the extraction of natural gas from shale, and made a commitment to continued large scale development of nuclear power (with ongoing research into Tokamak and other fusion power technologies), there would be no Arab-Islamist threat today and there would have been no 9-11. We made these vipers rich with our intellectual and cultural capital – as well as our currency. We empowered them. Without Western, and in particular US, petrodollars, the Middle East would be as it has been for the past ~1,500 years: an impoverished and insignificant pocket of mendacious people with a poisonous ideology.

Figure 20: Schematic of the basic Fischer-Tropsch fuel generating process.

The Fischer-Tropsch process (FT) was developed by Franz Fischer and Hans Tropsch, working at the Kaiser Wilhelm Institute in the 1920s, in response to Germany’s crippling inability to get access to petrol at the end of World War I. When Germany lost access to imported oil during World War II, the FT process was used by the Nazi government to produce substitute fuels. FT production accounted for 25% of the automobile fuel used in the Third Reich, with technology that is now 70 years old![4] The largest implementation of FT technology in the world today is the set of plants operated by Sasol in South Africa. South Africa, like the US, has large coal reserves but virtually no oil, and with the imposition of anti-apartheid sanctions, South Africa had no choice (so long as it wished to continue apartheid) but to develop a means to produce liquid hydrocarbon fuels from coal. This they did, and with enormous success, and they are now, and have been, independent of Arab oil. [Sasol, the South African FT fuel production company, uses coal (and now natural gas) as feed-stocks and produces a variety of synthetic petroleum products, including most of the country's diesel fuel.][4]

The whole sorry mess of the seemingly endless ‘ongoing crisis in the Middle East’ was thus avoidable. If we had but had the foresight, and the supporting structure of morals and values, to take the responsible course of action in response to the extortion that was the Arab Oil Embargo, everything would be different today. In October of 1973, the members of the Organization of Arab Petroleum Exporting Countries (OAPEC) proclaimed an oil embargo “in response to the U.S. decision to re-supply the Israeli military” during the Yom Kippur war. This embargo lasted until March 1974.[5] As a consequence of the embargo, oil supplies were severely disrupted, and a deep recession was triggered throughout the Developed World. Another consequence of the embargo was a serious fracture within NATO, which resulted in costly expenditures in foreign aid and defense spending. The unacknowledged price of the embargo was that the US had ‘caved’ by continuing to buy oil from OAPEC, thus allowing the Arab states to increasingly determine global oil prices by arbitrarily controlling supply at critical points in time. A corollary of this was that OAPEC also now had a significant voice in determining US, and thus Western, foreign policy.[6]

The moral choices this civilization has made are at the root of our current predicament, and are very likely to cost all of us our lives if not reversed; they are one cause of the fatal trajectory we are on. The second, and corollary, cause is that the core values of this civilization are at once deathist and bankrupt. These are ideas that we will be exploring here in the near future. And not just exploring, but proposing viable alternatives to – alternatives that express the values and ideals of a cryonics movement about to be reborn.

- Mike Darwin, 08 February, 2011


[1] There Ain’t No Such Thing As A Free Lunch.

[2] The physician Lewis Thomas described it in this way: “Halfway technology represents the kinds of things that must be done after the fact, in efforts to compensate for the incapacitating effects of certain diseases whose course one is unable to do very much about. By its nature, it is at the same time highly sophisticated and profoundly primitive… It is characteristic of this kind of technology that it costs an enormous amount of money and requires a continuing expansion of hospital facilities…”

[3] Thomas L. “The technology of medicine.” In: The Lives of a Cell. New York, NY: Viking Press; 1974:31–36.

[4] Leckel D. Diesel production from Fischer-Tropsch: the past, the present, and new concepts. Energy Fuels. 2009;23(5):2342–2358. doi:10.1021/ef900064c.

[5] How We Got Here: The 70’s the Decade That Brought You Modern Life–For Better or Worse. New York, New York: Basic Books, 2000. p. 318. ISBN 0465041957.

[6] Licklider R. “The power of oil: the Arab oil weapon and the Netherlands, the United Kingdom, Canada, Japan, and the United States”. International Studies Quarterly. 1988;32(2):205–226 [p. 206]. doi:10.2307/2600627. http://www.jstor.org/stable/2600627.

]]>