Thursday, January 31, 2013

Minimum Population to Support High Tech?

One of the plot problems that science fiction authors deal with regularly concerns the size of the population necessary to support a given level of technology.  I recently reread James Blish's Cities in Flight novels (originally published between 1955 and 1962).  In those stories, a fair-sized chunk of New York City has been converted into a stand-alone spacecraft capable of interstellar flight, visiting planets and selling knowledge-based services to the local settlers.  Blish assumes certain technologies that reduce the need for people.  The City Fathers are self-programming, self-repairing artificial intelligences that handle a wide variety of activities (e.g., teaching once students reach a certain level).  Processed algae grown in vats provides the necessary food, minimizing the number of people involved in agriculture.  Even so, Blish recognizes that a city with a couple million people is not truly self-sufficient.  New York takes payments in radioactive elements to power its reactors, and accepts immigrants with useful skills.  Most of the other-world colonies they deal with have been unable to maintain the level of technology the colonists arrived with.


One of the consequences often associated with a peak in the availability of liquid fuels [1] is that the world becomes a "bigger" place.  Long-distance trade declines; goods and services must be produced more locally than they are at present; certain economies of scale are lost.  Such restrictions must inevitably have an impact on the range of goods and services that can be produced.  To carry the idea to an extreme, a village of 100 people that is isolated and must be self-sufficient will produce very little beyond hunting, gathering, subsistence agriculture, and minimal shelter and clothing.  Even if one (or all) of the 100 knew everything about our current drugs for treating cancer, the village would not be able to build all of the infrastructure required to manufacture such drugs.  Instead, the tech of concern is more likely to include things familiar to the ancient Egyptians [2], such as drop spindles for spinning thread/yarn and simple hand looms.  So, as a hypothetical question, what's the minimum population necessary to support today's level of technology?

What defines "today's" technology?  I suggest (and this is a subject about which reasonable people can certainly disagree) that the defining piece of tech in the contemporary world is the ultra-large-scale integrated circuit.  Everyone (well, certainly everyone reading this) comes in contact with it every day.  Our computers, mobile phones, televisions, modern medical scanning and even cars all depend on such circuits [3].  The ability to produce them requires a very high level in several different disciplines: mechanical engineering, chemical processing, and software to name three.  If a society is capable of designing and producing such circuits, the rest of high tech follows.  Take away those billion-transistor circuits and you have to settle for much less advanced alternatives in most tech areas.  For example, purely analog television can be done with vacuum tube technology; but digital television requires integrated circuits performing hundreds of millions of calculations per second to decompress the bit stream, resize the image to match the display device, etc.

The ability to manufacture state-of-the-art integrated circuits sits at the top of a number of pyramids.  The people who design and fabricate the devices require enormously specialized education, so there needs to be an education pyramid.  The fab lines themselves are miracles of mechanical engineering, positioning different pieces of the machinery to an accuracy measured in billionths of a meter.  Incredibly pure chemical compounds are required, hence entire chemical engineering and mining industries.  Water at (for most other purposes) ridiculous purity levels.  Large electricity inputs.  Billions of dollars in capital, implying sophisticated financial institutions.  The engineers and teachers need medical care, and dry cleaning, and day care for their children, and everyone needs various forms of entertainment.  Police and fire fighters.  Construction workers.  Insurance agents.  Plumbers.  Farmers.

One thing too often unremarked is that all of those "support" personnel have to be rich enough to afford products built around the integrated circuits being produced.  If "...there is a world market for maybe five computers" [4] is true, no one is going to spend billions on a fab line plus all of the ongoing expenses to operate it.  There has to be a market for millions of devices, and more likely tens of millions.  That rules out one whole group of science fiction scenarios, where a small technologically advanced aristocracy lives on the backs of millions of serfs.  At least, it rules those scenarios out until such time as the tech itself can become self-sustaining (in which case, one wonders why the aristocracy would bother with all those serfs).

Based on all of the above, my own guess as to the minimum population necessary to support production of integrated circuits at today's level of complexity is between 30 million and 50 million people, to provide the capital, the markets, and all of the human infrastructure.  In addition, I would guess that there would need to be at least one urban/suburban complex comprising three to five million of those people.  Cities appear to be a necessary, but not sufficient, condition for developing and maintaining tech.


[1] I'm not arguing either side of the Peak Oil debate here.  Nor am I asking questions about sustainability in the sense of natural resource use.  I'm just asking a speculative question about the relationship between population and technology.

[2] Image from Ancient Egyptian and Greek Looms, H. Ling Roth, Bankfield Museum, 1913.

[3] General Motors says that two-thirds of the engineering expense for a recently developed hybrid automotive transmission was for developing and debugging the software that ran on a dedicated embedded processor and made the transmission possible.  Modern jet fighters are designed to be unstable in flight in order to improve maneuverability; the pilots' controls provide "suggestions" to the software that actually flies the plane.

[4] This quote is generally attributed to Thomas Watson, head of IBM, although there's no evidence that he ever actually said it.

Thursday, January 24, 2013

Depressed Over CES

A former colleague of mine visits the Consumer Electronics Show in Las Vegas every January, then writes up and distributes a report on what interesting (and silly) things vendors have on display.  This year's report has left me quite depressed.  Not because firms aren't creating interesting (and silly) things -- but because I've been in the market for a particular piece of consumer tech for the last 20 years, and it's beginning to look like I may die before it's available. (Note: this is sort of an annual complaint for me.)

For more than 30 years, I've carried a little black notebook around.  It takes 8½ by 5½ inch paper with three holes punched appropriately.  All sorts of things are stored in the notebook: calendar, phone numbers, bits and pieces from ongoing projects (the 3D plot shown, for example).  The pockets in the front and back cover are stuffed with the various pieces of paper that one picks up over the course of the day/week/month.  Most importantly, though, I take occasionally copious notes in my cramped little handwriting.  I'm a mathematician originally and have done lots of technical things over the decades, so "notes" often includes math, graphs, and lines-and-boxes drawings, as well as plain text.  I want a computer replacement for my little black notebook.  TPTB have been telling me that what I want should be available Real Soon Now for a very long time.

Ideally, I want something in about the same form factor.  I don't mind that I can't stick it in my pocket; I can't stick the notebook in my pocket either.  When it opens up, I'd like two screens.  The left one can be an e-ink sort of display like those used in e-book readers; the right one needs to be the same physical size, but more along the lines of a tablet's LCD screen, capable of high-resolution input with a stylus.  One of the applications would be to have a book open in the left screen, and a piece of paper corresponding to each page in the book that I can jot notes on in the right screen.  Plus all the other things that my notebook does for me.

I had great hopes for what Microsoft's Courier product might have become.  It folded, had two screens, at least one of which appeared to allow high-resolution stylus input, all in about the right form factor.  Unfortunately, Microsoft closed the project down before the product was released [1].  The Galaxy Note 10.1 that my colleague saw at the show has potential, but the screen's still too small (an 8½ by 11 inch piece of paper has a diagonal measure of just under 14 inches), and pictures of what people have written on it all look like they were made with a grease pencil, not my preferred medium-point Parker ballpoint.  The YotaPhone is a somewhat outsized smart phone with a conventional color LCD display on one side and a monochrome e-ink display on the other.  The Boogie Board Sync allows uploading a sizable page of drawings and text from a stylus at better resolution than most touch devices, but doesn't support data flow in the other direction.  Plastic Logic has some really neat display and sensor technology, but hasn't combined them yet -- and no one seems to be building products around that technology.

Sigh... I suppose I'm just a fossil.  If only I'd get with the program -- I'm supposed to consume content, not create it.  Or at least confine my creation to exciting new things, like speech and image and video.  Not something as old-fashioned as drawing/writing fine lines on paper.


[1] Various reports suggest that the project was killed because it didn't fit the Windows/Office model that drives Microsoft profits.

Wednesday, January 23, 2013

Nuclear Isn't "Our" Opportunity/Problem

From time to time, I come across pieces written by people who describe commercial nuclear power in the US as "our" problem or "our" opportunity, where "our" means all parts of the country.  A couple of years ago I put up a short piece noting that existing US reactors were largely located in the eastern half of the country.  This post continues that thought, with some corrected numbers, and poses the question of whether "our" should realistically be interpreted as meaning the entire US in a nuclear context in the future.  To be blunt about it, I'm going to argue that some parts of the country receive most of the benefits of nuclear electricity; some parts have been expected to accept the long-term problems of nuclear electricity; and the two parts don't align very well.  In the long run, if that misalignment continues, it is apt to be an ongoing source of regional friction.

Any discussion of electric power in the US has to start from the fact that there are three largely independent power grids: the Eastern, Western, and Texas Interconnects.  The states of the Western Interconnect correspond quite closely to the definition of "West" that I use from time to time [1].  Only eight of the 104 commercial US reactors are located in states in the Western Interconnect.  In 2011, the Western Interconnect's nuclear generation accounted for (a) 9.8% of the Western Interconnect's total generation and (b) 9.2% of all US nuclear generation.  By contrast, the Eastern Interconnect's nuclear generation accounted for (a) 23.3% of the Eastern Interconnect's total generation and (b) 85.8% of all US nuclear generation [2].  In short, the Eastern Interconnect is much more dependent on nuclear power and accounts for the vast majority of nuclear generation in the US.

The largest downside to nuclear power is the problem of nuclear waste.  For the last 25 years, the only site considered for long-term storage of that waste has been the proposed Yucca Mountain repository in Nevada.  Other than the area immediately adjacent to Yucca Mountain (which anticipated a local economic boom), Nevada has opposed the siting of the long-term waste repository.  One of the arguments put forward by the opposition is that Nevada should not have to accept the perceived risks associated with transport and storage of the waste when Nevada has received none of the benefits of nuclear power -- there are no commercial reactors in the state.  In 2011, funding for the Yucca Mountain facility was discontinued, and no alternative sites have been suggested.

This East/West contrast seems unlikely to change.  The Nuclear Regulatory Commission published a map, shown to the left, identifying the locations for proposed new reactors [3].  None of those reactors are in the Western Interconnect.  Additionally, of the eight reactors in the West, the two at San Onofre, California have been off-line for the last twelve months, and there is discussion about whether those reactors should ever be put back into full-scale service.  Looking forward, then, we see increased rates of waste production outside of the West and potentially decreased rates of production within the West.

Will a new waste repository be sited in the West?  It seems likely that any such proposal would be challenged in the federal courts, probably going all the way to the Supreme Court.  The Roberts court has, this past year, ruled that there are things which the federal government cannot coerce states to do involuntarily -- a philosophical change in direction from the historical trend.  I suspect that accepting large amounts of nuclear waste from other states will turn out to be one of those things.  Which leaves the states of the Eastern Interconnect with a problem.  Not "our" problem in a broad national sense, but an Eastern (and to a lesser extent a Texas) problem.

It's fair to ask the question, "Even though they haven't acted yet, won't the West also need more reactors?"  There have been several relatively detailed studies published describing how the Western Interconnect could meet its needs from renewable sources alone.  Several of those start with a foundation of hydroelectricity, with which the West already has considerable experience.  Over the last decade, the Western Interconnect got 20-30% of its electricity from conventional hydroelectric power (depending on how wet the year is).  That's two to three times as much as the West gets from nuclear reactors.  A 2006 DOE report (PDF) considered, among many things, the amount of undeveloped hydro power in the US, by state, shown graphically to the left (you should be able to do "View Image" or your browser's equivalent to see a larger version).  In aggregate, the states of the Western Interconnect have considerably more undeveloped hydro power than has already been developed.  Yes, a batch of new transmission capacity would be needed to move electricity to the demand centers.  Yes, the availability of hydro power varies somewhat over the course of the year.  And yes, big dams aren't harmless.  But the potential is considerably larger than the current nuclear generation in the West.


[1]  The 11 states of my West are Arizona, California, Colorado, Idaho, Montana, Nevada, New Mexico, Oregon, Utah, Washington, and Wyoming.  The interconnect boundaries don't follow state lines exactly.  In particular, the Western Interconnect excludes much of eastern Montana and includes the area around El Paso, Texas.

[2] These figures are derived from the EIA's annual report on total generation by state and source for 2011.

[3] All NRC licensing decisions are currently on hold, while the Commission completes an environmental study on the impact of high-level radioactive wastes.

Thursday, January 17, 2013

North American Monsoon

One of the problems with the global climate change models continues to be their lack of resolution.  Most people who don't live in the US Southwest are unfamiliar with the North American Monsoon (NAM).  It's not as well known as its counterparts in India or Africa, but it accounts for a significant amount of the annual precipitation in the northwest parts of Mexico and the southwest parts of the US.  I live outside Denver, and the strength of the NAM can have dramatic effects.  The 2011 NAM was pronounced.  It rained at my house every day for the first two weeks of July, which caused me a great deal of anxiety: my daughter's outdoor wedding ceremony was scheduled for July 17, and I was having nightmares of another 6:00 PM downpour [1].  Watering outdoor plants wasn't necessary that month; some of the municipalities around Denver reported water usage was only half of the normal July amount.

Until recently, the analyses from the big climate models have not addressed localized climate features such as the NAM.  A paper submitted for publication by Ben Cook and Richard Seager considers the likely changes to the NAM over the course of this century.  Fundamentally, they forecast a shift in the timing of the NAM -- instead of being a June-July occurrence as it is now, it will become a September-October phenomenon.  The total amount of annual precipitation from the NAM is forecast to remain essentially unchanged.

Their prediction is focused on the main area of the monsoon, which is farther south than Colorado.  In the draft paper, they note that predicting the results for the more northern areas affected by the monsoon will be harder.  One of the principal forcing conditions for the changes in the monsoon timing in their model is decreased winter and spring precipitation in the area.  Colorado is in a peculiar position in that regard; other global models predict that northern Colorado will experience increased winter precipitation and southern Colorado will have decreased winter precipitation.  Denver sits between those changes; small north-south errors in predicting winter precipitation changes could have large effects on the forecasts for precipitation later in the year.

Agriculture in almost all of the western US has depended on water storage and distribution from the beginning [2] -- in essence, both time- and place-shifting the available precipitation.  Changes in the timing and strength of the NAM might require changes in storage capacity and the timing of releases.  Whether such changes would be a sufficient coping mechanism appears to remain a very open question.


[1] It all turned out well.  The thunderstorms stayed back over the high country that day, and all we got at the wedding was a nice cooling outflow breeze.

[2] Even areas with relatively large amounts of precipitation such as Oregon's Willamette Valley have significant seasonal problems.  Phoenix and Denver are wetter than Portland, at the north end of the Willamette, during the months of July and August due to the NAM.

Sunday, January 13, 2013

State of the Blog

I promised myself that, should this blog ever reach 10,000 page views, I'd write a short post about how things were going [1].  That happened earlier this month, so here goes.

Almost no one leaves comments.  This could indicate any of several things.  I might not be controversial.  Most of the page views could be robots scanning and indexing the Web.  Probably it's just hard to build a community of people willing to make comments.  I wonder if Blogger or any of the other blog-hosting sites publish information about how many blogs remain essentially comment-free?

By far the most viewed entry is this one about donor and recipient states of federal tax dollars.  If I type "donor and recipient states" into Google, that post comes up at the top of the list (hey, 15 minutes of Internet fame!).  There has been a surge of interest in it since the elections in November.  I expect that's a response to the large number of signatures showing up on secession petitions at WhiteHouse.gov; the conventional wisdom seems to be that the petitions come from "red" states; people in "blue" states like to point out in response that many of those red states are net recipients of federal dollars, so just let them go.  It seems odd to me that my post should be so popular, since the general theme of it isn't about red/blue comparisons, but about East/West.  I also did a follow-up piece about the difficulties of using simple totals to label western states as "recipients" in the derogatory tone that is often taken.

The second most popular entry is about The World's Most Sophisticated Whole-House Fan Controller™.  Clearly, some people are not happy with the controllers they have for their whole-house fans and are looking for an alternative.  I suppose there might be a business opportunity in there somewhere, although a smart controller is going to be a bit pricey.  Getting-started costs are high, since the cost of having a unit listed by Underwriters Laboratories runs to at least several thousand dollars [2], and the market over which that expense would be spread is probably too small.  Nevertheless, the controller is a good example of one of my recurring themes: the prevalence of large-scale integrated circuits at the heart of an enormous range of devices.

I haven't written as much about energy policy as I intended.  One of the things I had hoped would happen was that the blog would motivate me to investigate (and possibly build) models that incorporate regional effects of localized energy resources.  It turns out that I believe several things: (a) when historians look back at the first half of the 21st century, energy transition will be the important story; (b) electricity supplies will be the biggest part of that story; (c) the story will have turned out much better for some areas than for others; and (d) the areas where it will turn out better will almost certainly not be constrained by national borders.  Too many of the models I've seen -- and I certainly haven't seen them all -- deal with global or continental scales, which I think is too broad.

All things considered, I think I'll continue to muddle along.  If nothing else, writing is an excellent way to organize your thinking on a subject.


[1] Blogger's graphic shows page views per month.  It's kind of odd, since I didn't start this blog until the middle of 2010, and it shows page hits going back to 2008.

[2] Just the cost of getting a copy of the relevant UL documents describing the requirements to which such a device must conform runs to several hundred dollars.

Thursday, January 3, 2013

Playing With a Simple Depletion Model

Shale oil plays are a hot topic these days.  James Hamilton at Econbrowser points at a variety of presentations made recently at the American Geophysical Union meeting.  Some analyses are optimistic: Citigroup forecasts that by 2020 the US will be producing almost four million barrels per day of shale oil, with almost a million barrels per day from the Bakken Shale in North Dakota and Montana alone.  Others are much more pessimistic: David Hughes asserts that a more complete analysis suggests that Bakken production will indeed reach almost a million barrels per day in about 2017, but will then decline rapidly to only 50,000 barrels per day by 2025.

Several factors go into such analyses; most of those that appear to account for the difference between the optimists and pessimists are based on the depletion rate for production from the wells in a shale oil reservoir.  Production from any specific oil well declines over time.  The depletion rate is a measure of how rapid the decline is.  It is typically expressed as a percentage: production declines at x% per year, or at y% per month.  If the depletion rate is 10% per year, and the well is producing 100 barrels per day today, a year from now it will produce 90 barrels per day, two years from now it will produce 81 barrels per day, in three years about 73 barrels per day, and so on.  The experience so far is that wells in shale oil plays have very high depletion rates, in some cases in excess of 50% per year [1].
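
To make the arithmetic concrete, here is a tiny illustration in Python (my own, not taken from any of the analyses cited above):

    # Constant-percentage depletion: each year's production is a fixed
    # fraction of the previous year's.  Illustration only.
    production = 100.0          # barrels per day today
    annual_decline = 0.10       # 10% per year
    for year in range(1, 4):
        production *= (1.0 - annual_decline)
        print(year, round(production, 1))
    # prints 1 90.0, then 2 81.0, then 3 72.9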

The rest of this discussion is based on the graph shown above, produced from a simple depletion model.  The vertical axis is total production from many wells [2], in barrels per day.  The horizontal axis is months from a time zero, when drilling is assumed to begin.  The red curve is for a hypothetical baseline case that assumes conditions typical of what we are currently experiencing: 125 new wells drilled per month, initial production of 500 barrels per day per well, and a decline rate of 3% per month.  Starting from zero, production climbs steadily, flattening out at about two million barrels per day after 180 months (15 years).  At that point, the production from 125 new wells per month simply offsets the decline in production from the wells that have already been drilled.

The "collapse" shown after 180 months represents what happens to production if you were to stop drilling new wells at that time: an exponential decline of 3% per month.  For example, for a particular field, that exponential decline kicks in when you run out of places to drill.  Some experts believe that a decline rate of 3% per month is too optimistic.  The green curve shows what happens if the rate is 4% per month instead.  Maximum production tops out at only 1.5 million barrels per day, and the decline if/when drilling is stopped is noticeably steeper.

One can make the argument that oil companies will choose the best sites for their initial wells, and that over time the quality will decline.  The blue curve shows what happens if, in addition to using a 4% per month decline rate, the initial production from new wells also declines at 2% per year.  That is, during the first year new wells start producing at 500 barrels per day; in the second year new wells produce at 490 barrels per day; and so on.  In this case, production peaks after about seven years at 1.37 million barrels per day and begins to decline slowly.  New, lower-quality wells don't completely offset the decline of existing wells.

A final argument made by the pessimists is that as the quality of the wells declines, the rate at which wells are drilled will also decline.  That is, some investors who will pay to drill a well that delivers an initial production of 500 barrels per day won't pay for a well that produces only 400 barrels per day.  The violet curve adds a decline rate for new drilling of 2% per year.  Now production peaks after about five years at 1.27 million barrels per day, and after 15 years has dropped below a million barrels per day.
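
For readers who want to experiment, here is a minimal sketch of a depletion model along the lines described above.  It is my own reconstruction in Python, not the code that produced the graph: the parameter names, the 300-month horizon, the assumption that drilling stops after month 180, and the year-by-year stepping of well quality and drilling pace are all my choices.

    # A sketch of a simple depletion model (my reconstruction, not the original code).
    # Each month a cohort of new wells is added, every existing well declines by a
    # fixed percentage, and wells are dropped once they fall below one barrel per day.
    def simulate(months=300, wells_per_month=125, initial_prod=500.0,
                 monthly_decline=0.03, ip_annual_decline=0.0,
                 drilling_annual_decline=0.0, stop_drilling_after=180):
        """Return total production (barrels per day) for each month."""
        wells = []                    # current output of each active well, bbl/day
        totals = []
        for m in range(months):
            # Existing wells decline by a fixed percentage each month.
            wells = [w * (1.0 - monthly_decline) for w in wells]
            # Drop wells whose production has fallen below a barrel per day.
            wells = [w for w in wells if w >= 1.0]
            if m < stop_drilling_after:
                years = m // 12
                # Optional year-by-year erosion of well quality and drilling pace.
                ip = initial_prod * (1.0 - ip_annual_decline) ** years
                n_new = int(round(wells_per_month *
                                  (1.0 - drilling_annual_decline) ** years))
                wells.extend([ip] * n_new)
            totals.append(sum(wells))
        return totals

    # Roughly the four curves discussed above:
    red    = simulate()                                 # baseline
    green  = simulate(monthly_decline=0.04)             # faster well decline
    blue   = simulate(monthly_decline=0.04, ip_annual_decline=0.02)
    violet = simulate(monthly_decline=0.04, ip_annual_decline=0.02,
                      drilling_annual_decline=0.02)

As a sanity check, the baseline plateau works out to roughly wells_per_month × initial_prod / monthly_decline = 125 × 500 / 0.03, or about 2.1 million barrels per day, which matches the red curve's two-million-barrel flattening.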

Different assumptions, very different predictions.



[1] If production from the well is throttled to less than its potential, the depletion rate may be zero for a considerable amount of time.  Eventually, though, production will still decline.

[2] The code for the model drops wells once their production drops below a barrel per day.

Wednesday, January 2, 2013

Waste Not, Want Not

Oil wells typically produce both oil and natural gas.  While there are multiple ways that oil can be moved from a well to a location where it can be processed -- pipeline, rail car, or tank truck, either individually or in combination -- the only practical way to move the natural gas, if you want to process it for commercial sale, is by pipeline.  There are sometimes other local uses for it, such as reinjection into the oil reservoir to help maintain pressure.  If a pipeline is lacking, though, the easiest way to dispose of the gas is by flaring -- separate it from the oil and burn it in a more-or-less controlled fashion.

For a considerable time, flaring was relatively uncommon in the US.  The natural gas was valuable, the number of new wells was moderate, and it wasn't that expensive and/or time-consuming to extend gas pipelines.  The recent rapid development of shale oil has changed that.  Gas flaring is becoming much more common.  So much so, in fact, that the flaring in two areas is visible from space at night.  The image below is a composite nighttime image assembled by NASA.  While the urban areas traditionally seen in such images are still clearly visible, so is gas flaring from two major shale oil developments.


Flaring from Bakken shale production in North Dakota and Montana is a roughly circular blob.  Flaring from the Eagle Ford shale in Texas is a long arc.  There are a number of reasons that flaring is being used to dispose of the gas in these areas.  Production of shale oil involves a large number of wells, so the collection pipeline networks will be large and complex.  Wellhead natural gas prices in the US are currently very low, making it more difficult to recover the costs of those collection networks.  The Bakken field is quite distant from large-scale gas processing facilities, adding to the expense.  And in the case of the Eagle Ford field, much of the gas contains significant amounts of hydrogen sulfide, a toxic gas that has to be removed and disposed of if the gas is to be used commercially [1].  Combined with low prices, that means it is quite possible that Eagle Ford oil producers would have to pay someone to take the gas.

While the Bakken is quite isolated, the Eagle Ford activity is occurring relatively close to the city of San Antonio (indicated on the map).  Concerns have been raised that combustion products from the Eagle Ford flaring will add enough to San Antonio's local air pollution to put the city in violation of federal standards for ozone.  In the meantime, Bakken flaring is estimated to burn about 100 million cubic feet of natural gas per day, roughly the demand for gas from a city of 500,000.  Or, expressed in more local terms, enough to meet the needs of 70% of the population of North Dakota.
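
As a rough sanity check on those figures (the consumption and population numbers here are my own approximate 2012 values, not anything from the sources above):

    # Back-of-the-envelope check; the 25 Tcf/yr US gas consumption, 315 million US
    # population, and 700,000 North Dakota population figures are my rough estimates.
    flared = 100e6                     # cubic feet of Bakken gas flared per day
    per_person = 25e12 / 315e6 / 365   # all-sector US gas use, ~217 cf/person/day
    print(flared / per_person)         # ~460,000 people -- roughly a city of 500,000
    print(500e3 / 700e3)               # ~0.71 -- about 70% of North Dakota's population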

I understand that the oil companies that hold leases in the Bakken and Eagle Ford areas are under pressure to produce oil in order to generate cash flow to cover their costs.  But at a larger level, do "we" need that particular oil so desperately that we should waste that much natural gas?  Or burn it so carelessly that we run up the air pollution levels in nearby cities?


[1] Flaring converts the hydrogen sulfide into less toxic compounds.