Saturday, December 22, 2012

Early 2013 Political Predictions

I'm a terrible predictor of political maneuvering (probably because I want it to make sense, which it seldom does).  As a public service, then, I'll make my predictions on the outcome of the current tax and debt ceiling fiasco -- just so everyone knows what's not going to happen.
  • No deal this month.  The conservative Republicans in the House think they have lots of leverage in multiple ways come the first of the year and won't vote for anything that has a prayer of getting passed by the Senate Democrats.
  • Come January 3, the new Speaker of the House will be someone other than Rep. Boehner.  The new Speaker will be more to the liking of the conservative Republicans.  Since they think they hold a winning hand, they'll hold out for someone who is not as willing to reach a compromise with the President.  No Republican will vote for someone nominated by the Democrats, so the matter will be settled when the more moderate Republicans hold their nose and vote for the conservatives' choice.
  • Rep. Paul Ryan will be the new chair of House Ways and Means, with the power to dictate what goes into any tax bill.  White House negotiations with the House on tax matters will be with Rep. Ryan, not with the new Speaker.  No matter how things look in public.  The negotiations will fail to reach a compromise.
  • The conservative Republicans believe they have enormous leverage because the debt limit is going to have to be raised again.  At some point -- I would think a few days prior to the State of the Union address -- the President will make a national broadcast and announce that (a) Congress has left him with a set of contradictory laws, (b) he has instructed Treasury to continue selling bonds in excess of the debt limit as the least damaging way to resolve the contradiction, and (c) he hopes the House Republicans will come to their senses soon.  The State of the Union address won't even mention the debt limit.
  • Life goes on after the President's address.  Interest rates don't go up.  Treasury has no more problem selling bonds than they had before the address.  Public opinion strongly supports the President's action.  It becomes apparent to everyone -- except the conservative House Republicans -- that the leverage they thought they had was imaginary.
  • The impeachment bill introduced in the House fails.  At least 17 Republicans refuse to sign on, believing (properly, IMO) that to do so would be a disaster for the party's election chances in 2014.
  • Over in the Senate, modest filibuster reform has been implemented.  As a result, Sen. Reid is able to bring bills to the floor, and for the most part, to a vote.  Senate Republicans are unwilling to stand in front of the camera where the public can see them and drone on for days or weeks to block a vote.  The House eventually passes some tax cut bill, the Senate passes a very different version of same, and the conference committee puts together something that the President will sign.  The same group of Republicans that stopped impeachment vote for the compromise so that, come 2014, they will be able to campaign on the tax cuts they supported.

Tuesday, December 18, 2012

Population Patterns, West and East

From time to time I write about differences between the western and non-western [1] parts of the US.  While searching for cartogram software, I came across a paper that Esri published a couple of years ago [2] describing the value of "gridded" cartograms.  In such a cartogram, the variable of interest is measured at points on a regular grid rather than aggregated at a higher level, such as by state.  The use of a gridded cartogram allows other patterns to emerge.  One of the examples the authors give is a gridded cartogram for US population, shown here.


The distortions in the mesh overlay show variations in population density.  In areas where the lines are widely spread the population density is high; where the lines are crowded together the population density is low.  The narrowest parts of the "waist" separating West from East in the distorted version of the US outline correspond to the Great Plains and the Rocky Mountain regions (Denver, pinned between those two, sits right on the waist).  Because of the very low populations in those areas, they are compressed to almost nothing in the east-west direction.  In the case of the Great Plains, this is part of a long-term depopulation trend, and in a cartogram like this one, that area will continue to shrink.

The map illustrates one of the differences between West and East.  In the western portion of the map, it is possible to identify all of the major population centers individually: Seattle-Portland, San Francisco-San Jose-Sacramento, LA-San Diego, Las Vegas, Phoenix-Tucson, the Colorado Front Range, and Utah's Wasatch Front.  Between those areas, with relatively minor exceptions, things get very empty.  In the eastern portion of the country, the population is spread more broadly.  A quick glance at a relief map of the contiguous states explains a good deal of the difference.  In the West, there are limited areas where cities are feasible.


I believe that the difference in population distribution makes it necessary to approach certain energy problems in very different ways.  In the West, the vast majority of the population lives in relatively small areas.  Converting freight transport away from highways and onto railroads -- one of the most commonly anticipated reactions to tightening supplies of liquid fuels -- looks very different in the two regions because of those population patterns.  The following map, from the US Department of Transportation, shows highway freight density for the 48 contiguous states.  In the West, the highest-density links connect the small number of population centers (or in some cases, link western population centers to the western edge of the East).  Additionally, the routes for those links follow the limits imposed by the landscape.  Many of the high-density links in the West follow the same paths as the great pioneer trails, for exactly the same reasons.  East of the Great Plains, the network becomes enormously more dense.



Two regions, two very different situations, two different sets of solutions needed.


[1] As usual, my West consists of Arizona, California, Colorado, Idaho, Montana, Nevada, New Mexico, Oregon, Utah, Washington, and Wyoming.  I'll say "East" for the rest of the 48 contiguous states (even though I know that ticks off the Texans).  Alaska and Hawaii are both so unique in their situations that very little of what I talk about applies to them.

[2] Benjamin D. Hennig, John Pritchard, Mark Ramsden, and Danny Dorling, "Remapping the World’s Population: Visualizing data using cartograms", ArcUser Winter 2010.

Friday, December 7, 2012

Your Descendants Will Likely Never See Your Digital History

I've written before about the risks of digital storage for family archival material.  The IEEE Spectrum has published a story that leads me to believe that I was probably too optimistic when I wrote that.  The main topic of the story is that a group of researchers appears to have solved the Pioneer deceleration anomaly.  The part of the story that interested me was how hard it was for that group to put together the data needed to do the analysis.

First, some background.  Pioneer 10 and 11 were unmanned deep-space probes launched in 1972 and 1973 respectively.  Both traversed the asteroid belt and made fly-bys of Jupiter.  Pioneer 11 used a gravity slingshot maneuver around Jupiter to enable a fly-by of Saturn as well.  After passing those planets, the Pioneer spacecraft became essentially ballistic objects headed out of our solar system, with their courses determined only by the effects of gravity from the sun and planets.  By 1980, however, measurements showed that the probes were slowing more than could be accounted for by gravity alone.  The anomaly was announced in 1998, and hundreds of papers proposing explanations for the slow-down were subsequently published.

In order to conduct a detailed analysis of one of the possibilities, it was necessary to have access to as much of the Pioneer navigation data as possible.  That data was originally stored on various media.  From the Spectrum story:
"As luck would have it, most of the Pioneer 10 and 11 telemetry data had been saved and were available for study. Although there was no requirement that NASA properly archive these records, it turned out that systems engineer Larry Kellogg, a contractor and former Pioneer team member at NASA Ames Research Center, had been informally preserving all the Pioneer data he could get his hands on. Kellogg already had nearly all of the two probes’ master data records, binary data files that contained all the Pioneers’ science and housekeeping data.
Kellogg had taken care to copy those records, which in total took up just 40 gigabytes of space, from soon-to-be obsolete magneto-optical discs to a laptop hard drive. When we decided to work with the telemetry data in earnest, in 2005, one of us (Toth) had already been in touch with Kellogg, working on new software that could extract useful information from the master data records without the need for an old, decommissioned mainframe.
Procuring additional Doppler data that could help solve the mystery turned out to be a bit trickier. The JPL team had already collected all the radio-science data files that were easy to find and work with, but we knew that we needed more measurements. It took some time, but we were able to find additional files on the hard drives of JPL navigators’ computers and the archives of the National Space Science Data Center. We even found magnetic tapes stuffed in cardboard boxes under a staircase at JPL. Some of the files were in a rather sorry state, corrupted while they were converted from one storage format to another over the span of three decades."

Despite their best efforts, the authors (and others) recovered only 23 years' worth of data for Pioneer 10 and 10 years' worth of data for Pioneer 11.  The whole process illustrates exactly the problems that I suggested put family digital archives at risk: (1) will someone preserve the digital media, (2) will it be possible to extract the bits from the media, and (3) will there be software that understands the coding used for the data?  If NASA keeps magnetic tapes with the data from two extremely valuable space probes in cardboard boxes under a staircase, how many of my descendants are going to do any better?

NASA apparently did better with paper records.  The authors were able to obtain the original blueprints for the Pioneer craft -- designed in the days before computer-aided drafting -- for use in constructing a detailed thermal model of the vehicles.  Even with access to blueprints, it was necessary to consult retired engineers from the firm that built the Pioneers to obtain some details.  In addition to distrust of my descendants' storage conditions, there's also a question of whether they could afford the same kind of recovery effort as the authors (funded in part by the Planetary Society, a nonprofit space-advocacy organization).  I'm going to have to rethink my preservation "strategy."  Encode the files simply.  Get copies into as many hands as possible.  And find someplace safe to store a paper version.
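
As a small illustration of the "encode the files simply" idea, here is a minimal Python sketch of writing a plain-text manifest of SHA-256 checksums next to a directory of scanned files, so whoever inherits the copies can at least tell which bits survived intact.  The directory name is a placeholder, and this is just one approach of my own, not anything suggested in the Spectrum article.

# Minimal sketch: write a plain-text checksum manifest for an archive directory.
# The directory name "family_archive" is a placeholder for illustration.
import hashlib
from pathlib import Path

def write_manifest(archive_dir: str, manifest_name: str = "MANIFEST.txt") -> None:
    root = Path(archive_dir)
    lines = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.name != manifest_name:
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            # Relative paths keep the manifest valid when the tree is copied elsewhere.
            lines.append(f"{digest}  {path.relative_to(root)}")
    (root / manifest_name).write_text("\n".join(lines) + "\n", encoding="utf-8")

if __name__ == "__main__":
    write_manifest("family_archive")

The manifest itself is simple text, which is the point: it should still be readable long after whatever produced it is gone.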

Oh, and the deceleration anomaly?  No new physics, I'm afraid -- it appears to be simply asymmetric heat radiation from the on-board plutonium-fueled electric generators.


Photo credit: Slava G. Turyshev

Wednesday, December 5, 2012

Cynical Thoughts on the B1G Expansion

There's been a fair amount written about the expansion of the B1G conference (I'm never going to be comfortable using that silly acronym for the Big 10, but it's what they use) from the perspective of football.  Lots of complaints about how dumb adding Rutgers and Maryland is, how it won't generate more revenue for the conference teams but will destroy a bunch of tradition as the conference schedule adjusts to accommodate two more teams.  I think that the people who are criticizing are ignoring what the end game for all of the conference realignments really is.

Put it this way: four sixteen-team super-conferences, each with two eight-team divisions, have the clout to tell the NCAA to get out of the way.  They can set up an eight-team playoff with the winner from each division.  The four first-round games go to the big bowls; the two second-round games replace the current BCS championship game; and the championship game happens a week later.  The NCAA doesn't get a vote, and the bowls don't get a vote.  "Play it our way, or we'll set up the eight-team tournament ourselves," is the only threat they need to make.  If the conferences want to be generous, they'll let the big bowls host the last three games on a rotating basis, much as the BCS championship is held at one of those locations each year.

The NCAA will roll over for this because they'll be terrified about what might happen to the men's and women's Division I basketball tournaments if they don't.  The 64 schools in these hypothetical super-conferences put a lot of teams into the first and second rounds of those tournaments.  The super-conferences alone could almost certainly run 24- or 32-team basketball tournaments of their own with only a modest drop-off in quality.  And they could threaten to make the NCAA even less relevant by offering the Dukes and Gonzagas the opportunity to align themselves with the super-conferences for scheduling and tournament play in basketball and other non-football sports.  I secretly suspect that Duke would probably be happier playing Division I basketball and FCS football.

Assuming this is the actual end-game, then adding Rutgers and Maryland makes more sense.  Geographic coverage for TV; opportunity for the Ohio State or Michigan or Nebraska alumni in those areas to get to games (and perhaps be inspired to make contributions); it may not improve the finances of the current conference schools, but it probably doesn't make them worse.  And to be honest, the super-conferences aren't necessarily looking to add the few unaffiliated football powerhouses.  All of the conferences have doormat teams, and adding an "easy" game to the schedule of the powerhouses, where the stars can sit early and not get hurt, isn't necessarily a bad thing.  If winning your division of your conference guarantees you a spot in the playoffs, running up the score (or other statistics) to impress the pollsters isn't a priority.

I admit that it's a cynical viewpoint.  But I feel comfortable betting that even if the four super-conference arrangement hasn't been discussed by a bunch of the athletic directors at the top football schools, it's floating around in the back of their heads.

Monday, December 3, 2012

Random Secession Thoughts

One of the topics that has drawn recent comment in a variety of media is the number of petitions that have appeared, since the November election, at the "We the People" website the Obama administration operates, asking that states be allowed to peacefully secede from the US and become independent countries.  There is now a petition asking permission to secede for each of the 50 states.  The number of signatures on the petitions varies from state to state.  In the cartogram to the left, the size of each state reflects the number of signatures on that state's petition(s) for secession as of the middle of November.  A disproportionate share of the signatures is attached to the petitions for the red-shaded states in the southeast portion of the country [1].

Marc Herman has written a column offering suggestions to people who want their state to secede on how to mount a successful campaign.  Writing as someone who wants to see a particular piece of the US secede somewhere down the road [2] -- waiting until the time is ripe means more than 25 years out, less than 50 -- I think a few of his suggestions are good: make good economic arguments and avoid violence, for example.  One of the points Marc doesn't make is that the side that's seceding needs to be able to appeal to a broad spectrum of the population.  The American colonies had proponents of the revolution in both agricultural Virginia and industrial Massachusetts.  Given the timing of the petitions and all of the internet chatter, it seems safe to assume that the petitions represent "red" voters' displeasure with President Obama's reelection.  Let's see whether "unhappiness with the President's platform" meets the requirement for relatively broad support.

I'll use Georgia as an example of the problem that a secessionist would face.  Since the subject is approval/disapproval of the President's platform, start by considering the standard red-blue map of Georgia done at the county level.  Overall the state is predominantly red.  In a previous posting about cartograms, I noted (as have many, many others) that area isn't the same as people.  Moving down the maps, the red-blue cartogram in the next figure has scaled the counties by the number of votes cast for President last month.  The thing that really jumps out is the enormous expansion of the Atlanta metro area.  Outside of Atlanta the cartogram shows a lot of very small red counties and a smaller number of generally larger blue counties.  In short, President Obama appears to have done generally better in the more urban areas than he did in the rural areas.

The situation is even more interesting when we replace the simple red-blue coloring scheme with one that uses different shades of purple to represent the relative performance of the two candidates.  That is, a county in which President Obama received all of the votes would be blue, a county where Governor Romney received all the votes would be red, and one in which they split the vote evenly would be purple.  When you step back and look at this, the picture that emerges is one with urban areas in varying shades of purple, surrounded by much redder rural areas.  My interpretation (and as they say, your mileage may vary) is that there are thinly-populated rural areas where voters are strongly opposed to President Obama's platform, and urban areas where that platform has much stronger support -- a majority of voters in several sizable (by population) counties.
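
For anyone who wants to reproduce that kind of shading, here is a minimal Python sketch of the interpolation; it is my own illustration of the idea, not the exact scheme used to draw the figure, and the example vote counts are made up.

# Linear red-to-blue blend from a two-party vote split -- illustration only,
# not the exact shading used in the figures.

def purple_shade(obama_votes: int, romney_votes: int) -> tuple[int, int, int]:
    """Return (R, G, B): all-Romney -> red, all-Obama -> blue, an even split -> purple."""
    total = obama_votes + romney_votes
    share = obama_votes / total if total else 0.5   # Obama's share of the two-party vote
    return (round(255 * (1 - share)), 0, round(255 * share))

# A county that split 60/40 for Obama comes out a blue-leaning purple.
print(purple_shade(60_000, 40_000))   # (102, 0, 153)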

For someone who is serious about building a secession movement around dissatisfaction with the President, this is a problem.  The final gray-scale cartogram illustrates the aspect of the problem that I would be most concerned about.  In that map, both the size and the shade reflect the median household income in the county.  Counties with higher incomes are larger and lighter, counties with lower incomes are smaller and darker.  The spread is not nearly as pronounced as in the case of number of voters, but the overall pattern is still clear.  The Atlanta metro area is significantly richer than the rest of the state. This shows up in a number of ways within Georgia's state government.  One that is easy to find is the state's education equalization fund.  Georgia's fund, like that in many states, broadly implements an urban-to-rural subsidy.

The state's wealth (hence power) is concentrated in areas where opposition to the President's platform is not going to be well-received as a cause for secession.  A different cause, one that benefits Atlanta, would be necessary.


[1] The signature counts were as of roughly November 20, 2012.

[2] I don't draw my likely division in the same place that most of the people writing about secession draw theirs either.

Thursday, November 29, 2012

East Coast Energy Risks

CNN recently ran a print piece titled "Data shows East Coast gas shortages were inevitable".  The article points out that, given the situation before Hurricane Sandy, any largish disruption was going to create a gasoline shortage.  A number of reasons for the short-term problem were cited: low regional reserves, recent refinery closures, heavy dependence on long pipelines, and local retailers' dependence on having electric power to operate their pumps.  The author concludes that it will not be easy for the New York metropolitan area to avoid the same risk in future storms.

Nor was the situation confined to the New York region.  The area around Washington, DC also experienced widespread power outages, as shown in the chart to the left (Credit: Washington Post), although that area recovered much more quickly.  The risks are widespread throughout the BosWash urban corridor.  These are short-term risks: high winds knock out power lines, storm surge floods other sorts of infrastructure, etc.  The more important long-term point alluded to in the article is that the region sits at the end of long pipelines, long power lines, long rail lines, and long shipping routes over which it receives the large majority of its energy inputs.

The situation is likely to get worse over the next couple of decades.  The area is home to a number of aging nuclear reactors.  Yesterday, the New York Public Service Commission ordered Con Edison, the principal power provider for New York City, to develop plans to keep the power on in the event the Indian Point nuclear complex is shut down (Indian Point provides about 25% of the city's electricity).  The Indian Point operating licenses expire in 2013 and 2015, and renewals are likely to be held up both by procedures at the Nuclear Regulatory Commission and by the political opposition of the governor.  Nor are the Indian Point reactors the only ones that are aging badly (see Oyster Creek's tritium leaks in New Jersey for another example).  Proposed alternatives -- new gas-fired generation, increased imports of hydro power from Quebec -- generally increase the dependence of the region on distant energy supplies.

 From time to time I get into arguments with people about the future of the US East Coast cities in an energy-constrained future.  The people I argue with assert that those cities are in the best position, because they use so much less energy per capita than, say, Mississippi or South Dakota.  My side of the argument is that those cities are very risky places to be, because while they may use less energy, they are dependent on a very large long-distance network of transport systems to get the energy they do use.  A modern city without electricity isn't a city any more; in fact, it quickly becomes uninhabitable as the elevators, refrigeration, water, sewage treatment and so forth quit working.

I'm not as pessimistic as John Michael Greer, but do anticipate a slow steady change in what America and the world look like as energy constraints begin to pinch.  In the long run, I expect (although I don't suppose I'll live long enough to see it) the US to separate into multiple independent parts.  One of the interesting aspects of that separation will be how BosWash behaves.  They're wealthy, they have tremendous political power within the current structure, but they are heavily dependent on a far-flung network to deliver the energy they need.  Whether they can keep the energy flowing over that network will be an interesting question.

Monday, November 26, 2012

Easily-Defeated Article Limits

It has become increasingly common for newspapers to limit free access to their content.  Both the New York Times and the LA Times restrict the number of articles that you can read each month unless you're a paying subscriber.  This being the Internet, people immediately began looking for ways to defeat the limit.  There are a couple of different ways to do it (they're widely known, so I don't feel like I'm costing either paper anything).  One is to periodically delete all of the cookies the sites have stored with your browser.  Another is to keep your browser from running scripts from those sites.

Let me begin by remarking that both newspapers are doing their best to "have their cake and eat it too."  They want to make it easy for people to download articles without taking any extra actions.  For example, when I provide a link to a NY Times piece (which I have done), someone can follow that link and get a copy of the article immediately.  Unless, of course, the person following my link has already downloaded ten articles this month, in which case the Times wants them to get a message that their free-article limit has been reached and they'll have to pay if they want to see that particular article.  How does the Times implement that check against the limit?

Based on what we know about blocking the check, there are two parts.  First, each time you download an article, a "cookie" comes with it.  A cookie is just a chunk of data that your browser stores.  In this particular cookie is a count of how many articles you've downloaded this month.  Whenever you make a request to the site that sent you the cookie, a copy of the cookie goes to the server as part of your request.  The Times' server increments the article count in the cookie and sends it back.  From what we know, the Times' server does not do the actual blocking; it just increments the article count.  As part of the article download, the Times also sends along a script -- a piece of code that your browser executes.  The script checks the article count in the cookie and blocks the display of the article if the count is too high.  We also know that if the cookie doesn't exist, the Times sends back a cookie with a count of zero.
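
The whole exchange is simple enough to simulate in a few lines of Python.  The sketch below is my own toy reconstruction of the mechanism just described, not the Times' actual code; the cookie name and the limit of ten are assumptions.

# Toy simulation of the counter-in-a-cookie scheme -- not the Times' actual code.
LIMIT = 10

def server_response(request_cookies: dict) -> dict:
    """Server side: create the counter at zero if missing, otherwise increment it."""
    if "article_count" not in request_cookies:
        return {"article_count": "0"}
    return {"article_count": str(int(request_cookies["article_count"]) + 1)}

def client_script_allows_display(cookies: dict) -> bool:
    """Browser side: the script, not the server, decides whether to show the article."""
    return int(cookies.get("article_count", "0")) < LIMIT

cookies: dict = {}
for article in range(25):
    cookies.update(server_response(cookies))
    if article == 14:
        cookies.clear()          # "defeat" #1: periodically delete the site's cookies

# Without the deletion the counter would read 24 and the script would refuse to show
# the page; with it, the count restarted and the 25th article still displays.
# "Defeat" #2 is simpler still: if the script never runs, the check never happens.
print(client_script_allows_display(cookies))   # True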

Why implement the limit check in this fashion?  It makes things easier for the Times because all of the hard work is being done on your computer, not on their server.  For every request, the Times' server gets to do the same thing: increment the cookie counter, creating a new cookie if necessary, and send back the requested page plus updated cookie plus script.  No checking against a database to see if you're a subscriber.  No generating a different sort of response.  This makes the server simpler, faster, and (IMO probably the deciding factor) cheaper.  It also makes it easy to defeat the limit: either delete the Times' cookies periodically or refuse to allow the script to run.  Deleting the cookie in order to defeat the limit does raise an ethical question (I'm intentionally taking action in order to read articles that the owner hasn't given me "permission" to read).  The ethics of keeping the script from running are murkier.

There are good reasons to block scripts.  Scripts can find a lot of personal information about you and send it off to the bad guys.  In extreme cases, scripts can mess with your computer in bad ways.  Security advisers often recommend blocking script execution generally (the University of California at Santa Cruz guidelines for campus users are an example of such a recommendation).  If a person has blocked scripts, the Times' limit on the number of articles that can be viewed is defeated.  For a person running the Firefox browser with the NoScript add-on blocking execution of scripts, the Times turning on its article limit would have been a total non-event: that person's perception of the Times' site would have been exactly the same after the limit was turned on as it was before.  In effect, the Times is asking readers to operate their browsers in an insecure fashion so that the Times can implement article limits cheaply.

The Times is essentially saying, "We're going to put articles up in a public place.  We request that you only read ten articles per month without buying a subscription.  We want you to remember your count and stop at the appropriate time.  We're going to count everything you read, no matter how trivial, no matter how you got there (including following bad links we provide), against the limit.  And we're not going to make a serious effort to keep you from reading past the limit."  That's not a business arrangement, that's a request for contributions.

Tuesday, November 20, 2012

Fun with Cartograms

A cartogram is a map in which the geometry is distorted so that the displayed area of a region matches some variable other than physical area.  Red/blue cartograms with US states distorted to reflect the number of electoral votes rather than the physical area become popular every four years.  Entire web sites have been created to distribute cartograms.  From time to time, I find myself wanting to generate a cartogram, but have lacked the appropriate software.  Last week I decided to do something about that.  I spent a day looking at various free packages available on the Internet.  Some wouldn't run on my Mac; some required learning obscure details of a complex user interface; some required map data in specific formats I didn't have available.

Ultimately, I decided to build my own little system around M.E.J. Newman's cart and interp programs.  The programs are written in vanilla C and compiled properly on my Mac [1].  The paper describing the detailed algorithm [2] is also available.  I already had a file with state outlines that I had obtained from Wikipedia.  A couple hundred lines of Perl later, I had working code that would generate cartograms for the 48 contiguous states plus the District of Columbia, using Dr. Newman's programs to do the hardest part.  There are still a lot of details to attend to in order to make things a bit more general and more automatic, but at least I can play with maps.
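
To give a flavor of what the glue code has to do, here is a simplified Python sketch of the central step: rasterizing a per-state value (population, electoral votes, federal acreage) onto the regular density grid that cart then diffuses.  It is an illustration of the approach, not the Perl I actually wrote, and it cheats by treating each state as a single polygon already scaled to grid coordinates.

# Build the regular density grid that cart diffuses: one value per grid cell,
# with each state's value spread evenly over the cells inside its outline.
# Illustration only -- not the actual Perl glue -- and it ignores multi-part
# states (Michigan, for example) by assuming one polygon per state.

def point_in_polygon(x: float, y: float, poly: list[tuple[float, float]]) -> bool:
    """Standard ray-casting test; poly is a list of (x, y) vertices in grid coordinates."""
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def build_density_grid(values: dict, outlines: dict, nx: int, ny: int) -> list[list[float]]:
    # First pass: find which state, if any, owns each cell, and count cells per state.
    owner = [[None] * nx for _ in range(ny)]
    cells = {state: 0 for state in outlines}
    for j in range(ny):
        for i in range(nx):
            for state, poly in outlines.items():
                if point_in_polygon(i + 0.5, j + 0.5, poly):
                    owner[j][i] = state
                    cells[state] += 1
                    break
    # Gastner and Newman suggest giving the "sea" (and the padding around the map)
    # the mean density, so empty space is neither inflated nor squeezed much.
    land = sum(cells.values())
    mean_density = sum(values.values()) / land if land else 0.0
    grid = [[mean_density] * nx for _ in range(ny)]
    for j in range(ny):
        for i in range(nx):
            if owner[j][i] is not None:
                grid[j][i] = values[owner[j][i]] / cells[owner[j][i]]
    return grid
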
The first map shown to the left is the basic undistorted map.  It's either a conic or equal-area projection of the continental US; the Wikipedia page doesn't say which.  That's not really important, as the two are essentially identical over this area.  The Wikipedia file has a couple of small errors in the outline descriptions that show up in certain drawings.  I corrected the worst one, but plan on obtaining different outlines at some point in the future anyway.

The next map is distorted so that the area of each state reflects its population.  The proportions are not perfect, but close.  The errors are probably due to my using too little padding around the map.  The Gastner and Newman paper discusses how much padding is appropriate, and I used less than they recommend.  In areas where the population density is roughly the same over adjacent states (eg, Illinois, Indiana, and Ohio) the shapes of the states are recognizable.  Where density changes drastically (eg, California) the shapes are more distorted.  This is the classic problem for cartograms -- how to adjust parameters so that things don't get distorted too badly.

The next map is distorted so that the area of each state represents the size of the federal land holdings within that state.  The same 11 states are shaded violet in this map and the preceding one.  It is one thing to read that most such holdings are in the West; the cartogram makes that painfully obvious.  The two violet areas can't be compared directly; the total areas of the 48 states in the two distorted maps don't match.  With some care, though, the diffusion algorithm should make it possible to set things up so that the maps can be compared.

Finally, just to show that I can do it, the basic red/blue map distorted by each state's electoral votes for the 2012 Presidential election.  Note how prominent Washington, DC becomes in this map, as it expands so its area is equal to that of the other states with three votes (eg, Wyoming and Vermont).  As always, there are lots of things that could be done with color shading to convey additional information.


[1] The programs depend on one external library for an implementation of the Fast Fourier Transform.  That library is also free, has been ported to many operating systems, and built just fine on my Mac.

[2] Michael T. Gastner and M. E. J. Newman (2004) Diffusion-based method for producing density equalizing maps, Proc. Natl. Acad. Sci. USA 101, 7499-7504.

Saturday, November 17, 2012

Western Donor and Recipient States

By far the most popular entry in this blog has been one I wrote about federal donor and recipient states: that is, whether states pay more or less in federal taxes than they receive in federal expenditures.  In the last ten days or so, there has been a sharp increase in the number of times that page is downloaded.  I suspect that is related to a prediction I made to a friend shortly before the election that if President Obama won, there would be an increase in the use of the terms "secession" and "revolution" from people in so-called red states [1].  The increase has certainly happened.  My opinion is that the increased interest in donor/recipient status is from people in blue areas researching the often-quoted factoid that blue states (and within states, areas) subsidize red states/areas.

That previous piece wasn't concerned with the red-blue differences, but with the difference between the 11 states from the Rockies to the Pacific and the rest of the country.  By the usual Tax Foundation measures, the western states clearly subsidize the rest of the country.  Within that group, depending on exactly what you compare, five states are donors: California, Colorado, Nevada, Oregon, and Washington.  Of the remaining six, all but Arizona have populations below three million -- some way below three million -- so they don't have a large effect on the total.  In this piece, I want to argue that even for the six western recipient states, the reasons they are recipients don't necessarily match the conventional wisdom of poor, lazy, etc.

The first unconventional factor is the very large federal government land holdings in the West.  In each of the 11, between 30% and 85% of the state's area is owned by the feds.  This ownership results in assorted distortions.  Wyoming is an example.  Almost 40% of the coal mined in the US is produced in Wyoming.  Much of that production is from federal lands.  In states where large amounts of coal are produced from private land -- West Virginia, Kentucky -- the state levies substantial severance taxes.  Wyoming can't tax the federal government.  Instead, the federal government shares a portion of the royalty revenue it gets from the coal mining with the state.  In the federal flow of funds accounting, these royalty payments show up as federal expenditures.  Absent those large payments, Wyoming would be a donor rather than a recipient.

Another factor is the size of some of the operations conducted on federal land in the West.  A number of national laboratories and large military bases are located in the West.  These operations often loom even larger when compared to the populations of the states that host them.  New Mexico is an example.  Sandia National Laboratories is located in New Mexico, and the Lab's $2B annual budget is counted as an expenditure in New Mexico.  The White Sands military reservation (including the White Sands Missile Range) has a similar budget.  Those two facilities alone account for about $2,000 in federal expenditures for every person who lives in New Mexico.  Idaho is another small state with a large national laboratory.

Finally, there are some demographic factors that affect the outcome.  States like Arizona are home to a large number of retirees.  While a farmer works in Illinois, his Social Security and Medicare taxes are collected in Illinois.  When he retires to Arizona, his SS and Medicare payments are federal expenditures in Arizona.  Increasingly, he may bring Medicaid money into Arizona as well, if he moves into a nursing home and is poor enough [2].  There doesn't appear (to me) to be any sane way to account for people retiring to states other than those where they paid their taxes while they were working (and where their children continue to work and pay taxes).  Absent some way to account for that situation, Southwestern (and Southern) states receive disproportionate social insurance payments, but it's not their fault.

My point is that determining the donor/recipient status of a state in a meaningful way is harder than just comparing tax receipts and flow-of-funds expenditures.  In the West, it's harder than most places.


[1] There are very few all-red or all-blue states.  The real divide is a rural/urban thing, which becomes clear if you look at the red-blue maps done at the county level.  For example, Georgia may be red but Atlanta and adjacent counties are blue, at least in the last two Presidential elections.

[2] Almost 50% of Medicaid expenditures are now for long-term care, particularly for the low-income elderly.

Monday, November 12, 2012

IEA World Energy Outlook 2012

The International Energy Agency (IEA) released their World Energy Outlook 2012 report today.  The big news about this forecast is that they predict that by 2020 the US will be the largest oil producer in the world, and by 2030 the US will return to being a net oil exporter.  The last time the US was a net oil exporter was in the late 1940s.  I've written about the WEO report before.  I said then that I found the forecasts improbable.  I still do, and for the same reason.  The numbers in the report are generated from the IEA's World Energy Model (WEM).  In the documentation for that model we find:
The main exogenous assumptions concern economic growth, demographics, international fossil fuel prices and technological developments.... Demand for primary energy serves as input for the supply modules.

As a modeller myself, I've always complained bitterly about this structure.  In effect, it allows the people using the model to: (1) assume a politically acceptable level of growth; (2) work backwards to the supplies and prices of energy necessary to produce that growth; and (3) assign production levels to various sources in order to produce the necessary supplies and prices.  In past years the IEA assigned large amounts of supply growth to the OPEC countries.  Now that OPEC has suggested that they won't be providing large increases in production, the IEA forecasts that tight oil in the US (plus natural gas liquids) will provide the needed increases.  What's missing in this picture?  There should be a feedback loop that links primary energy production costs to supplies and prices.  Producing a million barrels per day over decades from tight US formations such as the Bakken requires lots of new wells to be drilled essentially forever.  Money spent on drilling is not available to the rest of the economy.  Energy supplies and prices need to be a part of the economic model, not specified outside of it.
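
A toy example makes the structural difference clear.  Nothing below has anything to do with the actual WEM, and every number in it is invented; the point is only to contrast a model that assumes growth and backs out the required energy supply with one where rising production costs feed back into growth.

# Toy contrast of the two model structures -- invented numbers, illustration only.

def exogenous_style(years: int = 25, growth: float = 0.03) -> list[float]:
    """Assume growth, then back out the energy supply needed to produce it."""
    gdp, intensity = 100.0, 1.0           # energy use per unit of GDP, declining slowly
    required_supply = []
    for _ in range(years):
        gdp *= 1 + growth                 # growth is an input, never questioned
        intensity *= 0.99
        required_supply.append(gdp * intensity)
    return required_supply

def feedback_style(years: int = 25, base_growth: float = 0.03) -> list[float]:
    """Let rising extraction costs crowd out the rest of the economy."""
    gdp, intensity, unit_cost = 100.0, 1.0, 0.03
    gdp_path = []
    for _ in range(years):
        energy_share = unit_cost * intensity        # share of GDP spent on energy
        drag = max(0.0, energy_share - 0.04)        # the pinch starts above ~4% of GDP
        gdp *= 1 + base_growth - 5.0 * drag
        intensity *= 0.99
        unit_cost *= 1.05                           # each new barrel costs more to produce
        gdp_path.append(gdp)
    return gdp_path

print("supply the exogenous model demands in year 25:", round(exogenous_style()[-1], 1))
print("GDP in year 25 once the cost feedback bites:", round(feedback_style()[-1], 1))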

The graph to the left is an example of the kind of linkage that I'm talking about [1].  Since the 1970s, each time that US expenditures on crude oil have risen sharply to levels at or above 4% of GDP, a recession has followed.  People have built models with energy as part of the economy for decades.  The model in Limits to Growth is probably the best-known of the group.  Ayres and Warr have published a considerable amount of work in which the availability and cost of energy are a core part of the economic model.  Such models seem to yield pretty consistent results: without "then a miracle happens" technology intervention or new cheap sources of oil, we are living in the years where economic output peaks and begins to decline.

Over the last few years, the IEA forecasts have shown a lot of change from year to year.  This year there's a big swing in oil (and near-oil) production away from OPEC to the US.  Big swings don't give me much confidence in the underlying models.


[1]  Credit: Steven Kopits of Douglas-Westwood, testifying before the US House Subcommittee on Energy.

Friday, November 2, 2012

Election Day and Numbers

It's almost election day, so I feel obligated to write something political.

I'm a numbers guy -- always have been, probably always will be.  When politicians propose policy, I'm one of the people who demand that they show numbers that make at least some sense.  And when I want to know whether my candidates are doing well, I look at numbers: fund raising and polls in particular.  I'm also a pseudo-academic [1], so have been pleased that some real academics look at ways to combine multiple polls to give more accurate results.  The chart to the left is an example from the fivethirtyeight web site from earlier this month that shows estimated probabilities for Obama or Romney winning in the electoral college and in the popular vote.

Nate Silver and Sam Wang have been under attack lately.  Silver and Wang are only two of the better-known aggregators.  There are also sites like Votamatic and Real Clear Politics' No Toss-Up States.  All four of those show Obama with a high probability of winning the electoral college vote; not surprisingly, much of the criticism comes from Romney supporters.  One of the common complaints leveled at Nate Silver in particular is that he doesn't weight all polls evenly.  These attacks seem particularly partisan since unskewedpolls.com -- a site that is openly partisan in favor of Romney, and that mangles the reported polling data to show Romney apparently winning in a landslide -- is not given the same treatment [2].
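
To see why uneven weighting isn't sinister, consider a minimal Python sketch of an accuracy-weighted average.  This is my own toy, not Silver's or Wang's actual method, and the polls and weights below are made up.

# Toy accuracy-weighted poll average -- made-up numbers, not any aggregator's real model.

def weighted_average(polls: list[tuple[float, float]]) -> float:
    """Each poll is (candidate's share in percent, pollster weight)."""
    total_weight = sum(weight for _, weight in polls)
    return sum(share * weight for share, weight in polls) / total_weight

polls = [
    (52.0, 1.0),   # firm with a strong historical track record
    (50.0, 0.8),
    (46.0, 0.3),   # firm that has missed badly before gets discounted, not ignored
]
flat = sum(share for share, _ in polls) / len(polls)
print(round(flat, 1), round(weighted_average(polls), 1))   # 49.3 vs. 50.4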

Another form of attack comes from the media.  The Washington Post's Chris Cillizza, whose Fix column rates various races, recently moved Ohio from "lean Obama" to "tossup".  Cillizza's reason for this?  "….the absolute necessity for Romney to win the state if he wants to be president - leads us to move it back to the 'tossup' category."  Not that the numbers have changed, but that Ohio has become the "must win" state for Romney, therefore it becomes a tossup.  Maybe Chris is right.  OTOH, I'm more inclined to the theory that Chris' job #1 is to sell newspapers and pull eyeballs to the Washington Post's web site.  That's a lot easier to do if it looks like a horse race.

And finally, there are attacks based on the hypothesis that polls can be wildly wrong because they don't reflect the secret behind-the-scenes things that only a political insider would know.  Certainly polls can be wrong.  They can be wrong even beyond the margin-of-error numbers that are always included in the press releases [3].  Polls were one of the reasons that the Chicago Tribune printed its "Dewey Defeats Truman" headline.  But the statisticians who design the polls continue to learn their craft and improve their skill.  For example, one hears that cell phones have made polling less accurate.  Yep, and you can bet that the poll designers were at the forefront of saying that, and then designing methods to account for the effect.

People like Nate Silver and Sam Wang, even though they may personally prefer an Obama victory, depend for their livelihoods on being accurate and unbiased.  Upton Sinclair famously said, "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"  It is difficult to get academic statisticians to bias their results when their reputations and future salaries depend on being unbiased.  I've hung out with academics and former academics most of my life.  And based on that experience, I'm a firm believer that if you can bring real numbers that show Silver and Wang have flaws in their models, they'll be the first to admit it.  So far, the attackers aren't bringing numbers.



[1]  The way I define these things, real academics get PhDs and work for universities.  They teach, do research, speak at conferences, and publish in refereed journals.  I'm only a pseudo-academic.  I stopped at multiple master's degrees, and my research activities were within the confines of the old Bell System and its various derivative parts following the 1984 break-up.  I have occasionally spoken at conferences and did publish a paper in the refereed IEEE Spectrum, but the conferences were, and Spectrum is, aimed at practicing engineers as well as academics.

[2]  Nate attempts to weight polls based on several factors, including historical accuracy.  At least IMO, the manipulations done by unskewedpolls.com and some others lack the same sort of statistical justification that Nate considers.

[3]  Other factors: the Tribune's political insiders also predicted a Dewey win, and working around a year-long printers union strike forced the Tribune to go to press before there were any actual results available.

Thursday, November 1, 2012

Do the Math's "Star Trek Future" Survey

Over at Do the Math, Tom Murphy has an interesting piece about a semi-formal survey of physicists he conducted, and those physicists' opinions about the achievability of various advanced technologies and situations.  The physicists covered the gamut from undergraduate majors to grad students to full faculty members.  The results are discouraging for those who -- like me -- were promised flying cars when we were young.  The surveyed physicists saw self-driving cars being generally available within 50 years.  Everything else on the list -- fusion energy, lunar colonies, contact with aliens -- was "out there" tech or applications, and a lot of things -- artificial gravity, warp drive, teleportation -- were in the "not going to happen" category.

Tom provides lots of caveats.  He cheerfully admits that he isn't a survey expert, and may have screwed up the structure of the questions.  The choices for time frames are quite broad: less than 50 years, more than 50 but less than 500, more than 500 but less than 5,000, and so forth.  The physicists put fusion energy into the second of those categories; both 75 years and 475 years in the future fall into that band.  He identifies an "expert gradient" pattern: the graduate students are more pessimistic than the undergrads, and the faculty members are more pessimistic still.  Tom even references -- in more polite terms -- the old saw that science advances one funeral at a time.

Like me, Tom thinks that our current high-tech society faces a number of difficult fundamental challenges in this century.  He takes a more global view than I do.  When he considers potential energy sources, for example, he often looks at global needs.  I admit to being a lot more parochial than that.  I think that there are big chunks of the world that have very little chance of maintaining their current population level and maintaining (or achieving) a high-tech society, so we need to be looking at regional solutions.  For example, India will have problems because of its large and growing population.  Africa will have problems because of the lack of existing infrastructure.  And so on.

The good news -- if you can call it that -- is that exotic new science doesn't appear to be necessary for some regions.  One of Tom's most interesting posts is the concluding one in his examination of alternate energy sources, an energy matrix that compares those sources on several measures (availability, potential size, etc).  Tom's conclusions?  Electricity is a solvable problem; a relatively small number of technologies, most already in existence, will probably suffice.  Transportation, though, is a hard problem, because electrification is more difficult there (compared to heating/cooling, lighting, etc).  I agree with those conclusions for some regions.

One of the regions that I worry about is the US's Eastern Interconnect.  Almost 70% of the US population lives in the states that make up that area; in 2010, those states were responsible for 72% of all US electricity generation; and also in 2010, 73% of that electricity came from coal and nuclear power plants (50% coal, 23% nuclear).  Over the next 25 years, the large majority of those nuclear plants will reach the end of their operating license extensions, and it seems unlikely (at least to me) that very many of them will be allowed to continue operation.  The Eastern Interconnect accounts for 80% of coal-fired generation in the US; any major reductions in coal use to address climate change and air pollution issues will fall very heavily on the Eastern Interconnect.  It's not clear to me where adequate supplies of electricity are going to come from.

Tuesday, October 23, 2012

Random thoughts on how people think

One of the sub-blogs at the League of Ordinary Gentlemen has a weekly "Monday Trivia" post.  Each day the question goes unanswered, the poser gives a hint, getting more specific as the week goes by.  I was the first to get the answer to this week's question/problem right, largely because the Tuesday-morning hint indirectly gave a lot of information.  If you're too lazy to go look, the question was basically "I'm thinking of three major league baseball teams linked together by a pattern in their geographical history.  What are the teams and what's the pattern?"

Patterns are something I've always been pretty good at, as have many of the people that I've worked with.  Other people don't seem to see patterns so easily.  Many years ago, I read a piece by a couple of psychologists that divided the population into two categories: people who see patterns, and people who don't.  They used this as a basis for explaining some of the problems that arose in management of technical people.  If a worker and her boss both fall into the same category, things work out pretty well; they're comfortable with each other.  If the worker and boss are of different types, though, you have one of two different problems.

In their argument, the psychologists said that technical people who don't see patterns learn to solve problems using cookbook-like procedures: try this, then try that, then try the other thing.  Given a good cookbook, many problems are tractable to such an approach.  Some problems aren't, though.  Pattern-seeing people, the psychologists went on, tend to make intuitive leaps to solutions that can't be found by using the cookbook; or at the least, finding the solution using the cookbook is time-consuming.

If the boss is a pattern-seer and the worker is not, both get frustrated.  The boss agonizes over the slowness of the worker, and the worker feels like the boss is pushing them to skip over the "standard" procedure with which they have always approached problems.  If the worker is a pattern-seer and the boss is not, the situation is somewhat different.  If the boss insists that the worker use the cookbook, the worker is frustrated (and in the psychologists' experience, usually leaves).  If the boss lets the worker use their pattern-seeing talent, the boss is often terrified.  To the boss, it looks like the worker has leaped off the edge of the cliff and, miraculously, landed safely on an answer.

My own views on US energy policy are that it's a problem that requires a pattern-seeing solution; there's not time to go through the cookbook, trying this and that and something else.  It's necessary to see all of the important trends and how they interact.  To recognize that some things are not possible, or at least not possible in the necessary time frames, and not waste time and money on them.  To recognize that some things are only possible in combination.  And most importantly, to recognize that most analysts over-constrain the problem.  We're going to have to answer the questions of what we will give up and what we will keep in reaching a workable high-tech future, because we can't keep it all.

That last paragraph is, of course, a total cop-out.  But it's too large a subject to do in a blog post -- it's a book-sized project.  I'm working on it.

Monday, October 22, 2012

Thoughts on the electoral college

As we approach election day, there has been the inevitable rash of pieces about the electoral college.  The majority of the ones that I've read suggest that the EC is an archaic compromise from the early days of the country that was necessary to get the Constitution ratified, but that there is no longer any need for it. (Some of the alignments on the different sides of that compromise were not as clear-cut as many history classes make them seem.  Georgia, for example, was a small-population state at the time but had large "western" land claims and anticipated rapid population growth, so it backed proportional representation.)  I'll take the opposite side of that argument.  Not only are some of the initial reasons for the EC (and the Senate) still in place, but new ones have emerged as well.

The types of agriculture that are emphasized in federal policy are regionally concentrated in the Great Plains, the Midwest, and the South.  Most current federal ag policies originated in the 1930s as part of an overall package intended to keep rural states from falling into a permanent second-class economic status.  The package was created largely with the support of the more urban coastal states.  Without the leverage provided by the Senate and EC, rural states have (at least in their own minds) reason to fear that they would be put back on the path to being second-class.

For the first 120 years or so of US history, one of the federal government's priorities was to transfer its public land holdings to private or state hands (see state land trusts and various homestead acts).  That policy was changed around 1900, and the federal government decided to retain its holdings (a policy change that was eventually stated formally in the Federal Land Policy and Management Act of 1976).  Today, the large federal land holdings are heavily concentrated in the West.  States in which a large portion of their area is held by the federal government have a long history of distrusting the federal government's management of those areas (with some historical justification).  In large part due to the leverage that the Senate and EC give those states, they have been able to insist for the last half-century that the Secretary of the Interior, who oversees most of the federal lands, be from a western state*.

Federal policy regarding regulation of coal-fired electricity generation provides yet another example.  Ten states sued the federal Environmental Protection Agency a few years back in a (so far successful) attempt to force the EPA to regulate CO2.  Of the ten, coal makes up a small part of electricity generation in eight.  In a separate lawsuit a few years later, 14 states sued the EPA over its newest rules regarding particulate/NOx/SOx restrictions.  In most of the 14, coal provides a large share of their generating capacity.  States that are heavy coal users for historical reasons fear that other states will be able to force them into making expensive changes in their infrastructure.

The original Senate/EC compromise dealt with the fears of states with small- or medium-sized populations that federal policy would be determined by a small number of heavily-populated states.  There continue to be issues where that same fear exists today.  It is highly improbable that the US Constitution will be amended to get rid of by-state representation.  The EC may be circumvented by state statutes such as the National Popular Vote Interstate Compact, but so long as amendments are voted on by states, the EC isn't going to be abolished.


* This may change.  Following the Deepwater Horizon oil spill in 2010, the question was asked, "What does a westerner know about deep water oil and gas production on the federal lands in the Gulf of Mexico?"  If there is a perception that the important federal holdings are the offshore Atlantic and Gulf regions, the western states may lose their leverage.

Friday, October 12, 2012

Rail-Oriented Transit in the West

Mass transit in general, and rail-oriented transit in particular, is regarded as one of the logical responses to steadily increasing costs for liquid fuels.  At least in the US, building new rail-oriented transit is usually a long drawn-out process.  A question that might be worth asking, then, is which parts of the country "have a leg up" on that process; that is, have made a significant start at building out a rail-oriented transit system.

This chart shows the rail-oriented transit systems in the US with the largest daily riderships.  The numbers are taken from Wikipedia pages on light- and heavy-rail transit systems.  The cities listed are the largest in the area served by each system; eg, San Francisco's heavy-rail system extends well down the peninsula to other smaller cities.  I arbitrarily cut the list off at 40,000 riders.  Several of the systems are located in the BosWash corridor, which is not surprising.  Chicago's famous "L" is also high on the list.  The other twelve cities on the list surprised me in one way: eight of them are in the West (defined from the Rocky Mountains to the Pacific coast) and only four of them are in the rest of the country.

The West has a small number of metropolitan areas with significant populations: Seattle, Portland, the San Francisco Bay area, LA/San Diego, Las Vegas, Phoenix, the Front Range of Colorado, and Salt Lake City.  All of those, with the exception of Las Vegas and Seattle, are on the list.  Seattle fails to make the list because its system's ridership is just over 30,000.  Las Vegas's monorail is even smaller, and while there have been proposals to expand the system in either monorail or light-rail format, none of those have gone very far.  Some of the Western systems are being expanded.  Denver, for example, has multiple additional lines already under construction that will go into service between 2013 and 2016 and substantially increase ridership.

The non-West has many more metropolitan areas on the scale of Portland or Denver than the West does.  Why aren't there large numbers of light-rail systems there growing at the same rate as those in the West?  There are many possible explanations, but I'll put forward my own: growth in the West has taught those cities a lesson that much of the non-West hasn't learned.  From 1990 to 2010, Alabama was a rapidly growing non-Western state; over that same interval, though, it grew at only half the rate of Oregon.  Cities in Ohio, which grew at less than half the rate of Alabama over the same 20 years, would be even farther from learning the proper lesson: at some point, more lane-miles aren't the answer.

Friday, October 5, 2012

It Ain't Just the Price of Crude

One of the lead stories in Friday's LA Times documents the recent rapid increases in gasoline prices, which have gone above five dollars per gallon in a handful of locations.  While the most recent prices are not yet reflected in it, the map of gasoline prices in the US shows two regions with significantly higher prices: the West Coast and the Northeast.  Some may claim that these numbers reflect variations in factors such as state gasoline taxes, but the reality is quite a bit simpler.

In the last few weeks, a number of refineries in California have been shut down due to accidents or the need for routine maintenance.  Stories have correctly pointed out that the West Coast is, for the most part, not connected to the large network of pipelines that connects various parts of the country farther east.  Earlier this year, multiple refineries serving New England and the New York/New Jersey Atlantic Coast area closed.  Stories at the time pointed out that the pipelines that could deliver product from Gulf Coast or Midwest refineries were already running at capacity.  Shipping gasoline by tanker is relatively inexpensive, enough so that once a Gulf Coast refinery has loaded a tanker, shipping the product to Europe may be more profitable than shipping it to the East Coast.

California seems to be the place where we get big demonstrations of what can happen when we don't pay enough attention to the intermediate steps between the raw material and the end consumers of finished energy products.  California's deregulation (well, reregulation would be more accurate) of electricity supplies gave the state the electricity crisis of 2000 and 2001.  Some of the key factors in that fiasco involved shutting down generating capacity to create shortages of the finished product.  Enron and others also manipulated limits in the transmission system -- the electrical equivalent of the oil pipeline network -- in order to create shortages and increase prices.

Energy policy can't stop with the production of the raw materials that go into the system.  It has to address the entire system, from well to pump for gasoline, from mine or wind farm to the meter on the outside of your house for electricity.

Friday, September 28, 2012

Family Archival Storage

One of the things my paternal grandmother left -- in addition to a marvelous music box -- was an overstuffed three-ring binder full of notes about the family tree.  I've finished scanning all 705 pages that she left.  Most of the pages are either pencil-on-paper or typed with a fabric ribbon; even the ones that are approaching 50 years old are in pretty good shape.  A few pictures that go back much farther are also included, and they too are well preserved.  The old black-and-white photos are, in fact, in better shape than the few newer color photos.  Copies of the scanned pages are being distributed to several family members to reduce the risk of losing everything in a single fire/tornado/other disaster here.

I used to have lunchtime discussions about archival storage, intentional or otherwise, with the woman who managed the company's technical library.  She maintained that we have a much better idea of what the common man thought and wrote about the US Civil War than people living 100 years from now will have about what we think today.  Her fundamental reason was that during the Civil War, people left their thoughts on media with an inherently long shelf life: silver-based black-and-white photography and pigment-based inks on relatively low-acid paper.  Letters or a diary written on such material can be tucked away in a trunk in the attic and still last for a long time.

Today, of course, we send e-mail and post stuff on Facebook and tweet.  Most of which never reaches paper, and all of which is subject to the vagaries of computers, both our own and those "in the cloud."  I have CD-Rs that are approaching 20 years old and seem to work fine on those occasions when I need something from one of them, but digital media are often an all-or-nothing proposition: they work perfectly until they fail, but once they fail they're unusable.  Then there are file system formats, and formats for the file contents themselves.  In 100 years, even if the bits on a CD-R are still good, will there be a drive that can read it, or software that understands the ISO file system, or applications that still handle JPEG image files?

Even if you assume conscientious descendants who periodically copy the material from the old physical medium to a new one, and transcode images from one format to another, there's a more subtle problem.  Serial encoding of images with lossy algorithms (eg, JPEG as it is almost always used) results in steady deterioration.  The first encoding introduces small errors in the reconstructed image.  Many of these errors produce visible artifacts in the image if you know what you're looking for.  The image to the left illustrates ringing errors in an MRI image (in this case, visible "echoes" of the sharp dark/light boundary) [1].  The serial encoding problem occurs because a future encoding with a different algorithm will waste bits trying to accurately reproduce the artifacts.
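If you want to see the effect for yourself, here is a minimal sketch using Pillow and NumPy.  It re-encodes an image over and over, alternating between two quality settings as a crude stand-in for transcoding between different encoders, and prints how far each generation has drifted from the original.  The file name and the quality values are placeholders of mine, not anything from my actual scanning setup.

import io

import numpy as np
from PIL import Image

def rms_error(a, b):
    """Root-mean-square pixel difference between two same-sized images."""
    return np.sqrt(np.mean((np.asarray(a, dtype=float) -
                            np.asarray(b, dtype=float)) ** 2))

original = Image.open("scanned_page.png").convert("RGB")  # hypothetical input file
current = original

for generation in range(1, 11):
    quality = 75 if generation % 2 else 85   # alternate settings to mimic transcoding
    buf = io.BytesIO()
    current.save(buf, format="JPEG", quality=quality)   # lossy re-encode
    buf.seek(0)
    current = Image.open(buf).convert("RGB")
    print(f"generation {generation:2d} (quality {quality}): "
          f"RMS error vs original = {rms_error(original, current):.2f}")

Re-encoding at a single fixed quality tends to stabilize after the first generation or two; it's the change of encoder or settings at each step that keeps the damage accumulating.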

I find myself struggling to find some sort of middle road, one that allows me to take advantage of the ability of contemporary tech to impose indexing and organization on Grandma's work and additions to it, and at the same time, to allow both Grandma's stuff and any material I add to survive something as extreme as skipping a generation along the way.


[1] Image taken from the American Journal of Roentgenology, An Introduction to the Fourier Transform.

Wednesday, September 12, 2012

Food Prices and Political Crises

The New England Complex Systems Institute has updated its briefing on the relationship between food crises and political crises, with particular attention paid to the Middle East and North Africa.  The bottom line, overly simplified, is this: when global food prices go too high, the MENA countries experience an increase in the frequency of political unrest.  All of the MENA countries are net importers of food calories, much of it in the form of bulk grain.  Several MENA countries import more than a million metric tons of corn, rice, and wheat per year: Algeria, Egypt, Iran, Iraq, Israel, Morocco, Saudi Arabia, Syria, Turkey, and Yemen.  Egypt alone imports over 15 million tons, more than 400 pounds per person.
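That per-person figure is easy to sanity-check.  The short Python sketch below does the arithmetic; the only number I've supplied myself is Egypt's population, which was roughly 82 million in 2012.

grain_imports_tonnes = 15_000_000   # metric tons of imported grain per year (from the post)
population = 82_000_000             # approximate 2012 population of Egypt
LB_PER_TONNE = 2204.62              # pounds per metric ton

lb_per_person = grain_imports_tonnes * LB_PER_TONNE / population
print(f"{lb_per_person:.0f} pounds of imported grain per person per year")
# works out to roughly 400 lb/person, consistent with the figure above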


The NECSI results are summarized in this figure.  The black line shows nominal food prices; the vertical red lines are incidents of political unrest, many of them concentrated in the "Arab Spring" of 2011.  The paper develops a model and a price threshold at which political unrest is likely to occur.  Based on current predictions of price increases (US corn prices reached record highs in August, but have retreated somewhat since then), the model predicts that another round of incidents will begin in the region in October, 2012.

What's driving the price increases?  One large factor is the drought that spread across much of the US grain-growing regions this year (weekly maps showing its extent can be found here).  The higher prices reflected a belief that US grain output would be significantly smaller (although today the USDA announced that the corn crop may not be quite as bad as previously feared).  A second factor is the use of corn as a feedstock for producing ethanol as a transportation fuel in the US.  The US has a renewable fuel standard that requires an increasing total amount of ethanol to be blended into gasoline supplies.  While the overall situation regarding ethanol is complex [1], the ethanol industry will still purchase large amounts of the corn crop this year.

The NECSI work provides a nice formal mechanism for quantifying something that I find myself saying too often: "Drought in the US means that Africa starves."  To a lesser extent, you can replace the US in that statement with Argentina or Australia.  The reduced US grain crop follows relatively poor harvests in both Argentina and Australia, two other major grain exporters.  You can also replace drought with increased domestic demand and get the same result.  US ethanol mandates may be fine domestic policy -- although there are lots of analysts who disagree with that -- but terrible global policy.  And not just in terms of millions of starving people.  The MENA countries produce and export a lot of oil.  Political unrest generally disrupts activities like oil production (oil pipelines are easy targets for dissidents).  High prices for global grain exports could conceivably lead to higher prices for global oil exports.

Aren't complex dynamic systems, filled with feedback loops and time delays, fun?


[1] The penalty for failing to meet renewable fuel standard requirements is a fine; companies may choose to pay some fines rather than very high prices for ethanol.  In previous years the oil industry blended more ethanol than required, accumulating credits that can be used to offset a shortfall and avoid fines this year.  Blended ethanol also raises the octane of the final gasoline product, and since there are minimum octane requirements and the alternatives to ethanol may be more expensive, the oil industry may continue to use ethanol even at higher prices.  Based on the years I spent working for large corporations, I can guarantee that someone -- probably many someones -- is busily trying to solve an optimization problem that minimizes the cost of all of these factors in aggregate.  A toy version of that problem is sketched below.
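Just to make the shape of that optimization concrete, here is a minimal linear-programming sketch using SciPy.  Every number in it (the prices, the octane contributions, the volumes, the fine) is made up for illustration; this is not anyone's real blending model.

from scipy.optimize import linprog

price_ethanol, price_alt, fine = 2.60, 3.50, 1.20    # $/gallon, hypothetical
octane_eth, octane_alt = 1.0, 0.8                    # octane "boost" per gallon, hypothetical
octane_needed = 900_000                              # total boost required, hypothetical
mandate, credits = 1_000_000, 150_000                # mandated gallons and banked credits, hypothetical

# decision variables: x = [ethanol blended, alternative booster used, mandate left unmet]
c = [price_ethanol, price_alt, fine]                 # cost per gallon of each choice
A_ub = [[-octane_eth, -octane_alt, 0.0],             # meet the octane requirement
        [-1.0,         0.0,       -1.0]]             # ethanol + unmet >= mandate - credits
b_ub = [-octane_needed, -(mandate - credits)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
ethanol, alt, unmet = res.x
print(f"blend {ethanol:,.0f} gal ethanol, {alt:,.0f} gal alternative, "
      f"leave {unmet:,.0f} gal of mandate unmet; total cost ${res.fun:,.2f}")

With these particular toy numbers, the solver keeps ethanol in the blend for its octane value and the mandate ends up satisfied as a side effect, which is roughly the dynamic described above.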

Friday, September 7, 2012

Saudi Arabia, Oil Importer?

One of the frequently cited pieces of oil financial analysis over the past few days is a Citigroup report suggesting that Saudi Arabia could become an oil importer by 2030 (eg, Bloomberg's story on the report).  The notion that the Saudis would be consuming all of their production, and then some, is based on extrapolating several trends.  The two fundamental trends are illustrated in the graph below (from Mazama Science's oil export database visualization tool): relatively flat production over the last two decades and steadily increasing internal consumption.
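The arithmetic behind that kind of projection is simple enough to sketch: hold production flat, let consumption compound, and find the crossover year.  The starting values and growth rates below are round, illustrative numbers of my own, not Citigroup's actual assumptions.

import math

production = 10.0        # million barrels/day, held flat (illustrative)
consumption_2012 = 3.0   # million barrels/day of domestic use (illustrative)

for growth in (0.05, 0.07):
    # solve consumption_2012 * (1 + growth)^years = production for years
    years = math.log(production / consumption_2012) / math.log(1 + growth)
    print(f"at {growth:.0%}/yr consumption growth, "
          f"exports reach zero around {2012 + years:.0f}")

Note how sensitive the answer is to the assumed growth rate: a couple of percentage points moves the crossover by years, which is worth keeping in mind when reading headlines built on 20-year extrapolations.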

There are a number of reasons to believe the direction of the basic trends will continue.  Books have been written on when the Saudis will reach terminal decline in their production; Simmons' Twilight in the Desert is probably the best known, as well as being one of the most pessimistic.  On the demand side, the Saudis generate almost half of their electricity using oil (a practice the developed countries of the world gave up in the 1970s during the two oil crises); the country's population is young and growing rapidly; and domestic prices for oil products are set well below global market prices.  All of those factors point to growing domestic demand.

The best argument against the notion of the Saudis becoming net oil importers is the enormous role that revenues from oil exports play in the country.  75% of the government's revenues are from oil, and that money pays for many programs including defense, infrastructure development, education, and health care.  This is not spending that the government can simply turn off without serious consequences.  In the absence of increased production, the government needs to maintain a substantial level of oil exports, or rapidly build the non-oil economy, or both.  An additional problem the country faces is the make-up of its work force: a staggering 80% of workers are foreign nationals, not Saudis.

The Saudis appear to be taking some steps to delay the decline in oil-based revenues.  They have signed nuclear deals with France, China, Argentina, and South Korea.  Commercial reactors would be used in place of oil to generate electricity and desalinate sea water.  The government is also considering construction of large amounts of solar generating capacity for the same reason.  And for many years the Saudis have been working to move "up the value chain" in order to get more revenue from their petroleum: selling plastics derived from petroleum pays better than selling the petroleum itself.  Similarly, selling refined products yields more revenue than selling crude oil, and the Saudis have invested in refining capacity at home and overseas.  "Saudiization" of the work force is an official government policy, which might, in theory at least, reduce the population and domestic demand.

Can the Saudis pull all of this off and continue to export at current levels in 2030?  My own opinion is "not a chance."  Will their exports go to zero by 2030?  I don't think that is going to happen either, although the exports may be plastics, chemicals, and finished petroleum products rather than crude oil.  I also think the path to that point is going to be erratic and possibly violent.  A lot has to change, and 18 years isn't a lot of time to accomplish those changes.

Tuesday, September 4, 2012

A Disappointing Study of US Renewable Electricity Potential

Back in July the National Renewable Energy Labs (NREL) published a report titled U.S. Renewable Energy Technical Potentials: A GIS-Based Analysis.  Geographic Information System (GIS) data has become really popular.  Thanks to satellites and the ability to handle massive amounts of image data, the world today is better mapped than previous generations could even dream about.  Using available GIS data, the authors evaluated the potential for each of several different renewable sources of electricity in the US.  Consider one resource -- rural utility-scale photovoltaic panels.  The authors identified all of the land that met certain criteria (rural, sufficiently flat, not water, not in a national park, etc) and calculated how much power could be generated using today's PV technology if all of the identified land were used for that purpose.

The table to the left summarizes the estimated potential for ten different renewable resources.  The potential is shown in petawatt-hours; in 2010, total US retail sales of electricity were about 3.7 PWh.  The total figure is not meaningful.  The land considered suitable for concentrated solar and rural utility-scale PV is the same land, and it can't be used for both.  Even some of the individual numbers aren't meaningful.  Much of the land suitable for rural utility-scale PV is also suitable for growing food crops, an application that can't realistically be ignored.  Most renewable resources are intermittent on some time scale.  Rural utility-scale PV may be able to provide many times current US power consumption, but without some form of storage, it still won't keep the lights on at night.


The authors presented their results at the state level and used the data to produce maps like the one shown to the right.  This particular map shows the potential for hydrothermal power, which is concentrated in the western states.  But state level aggregation doesn't seem particularly helpful either.  Western states are large, so energy sources can still be far from the population centers (as well as separated by the odd mountain range here and there).  Western states vary enormously in terms of their population.  California and Nevada may be in the same category in terms of hydrothermal potential; but because of the differences in their populations, Nevada's resource may be sufficient to meet all of Nevada's needs, while the same resource in California can provide only a small fraction of what is needed.

I'm seriously disappointed by the study, which I think adds very little value.  We already knew that rural utility-scale PV could potentially produce far more power than we currently consume.  I can think of a half-dozen things to do with the data that would have been much more useful.  For example, for some (or all) of the 50 largest metro areas, how far and in what pattern would rural PV need to be deployed to meet each metro area's power needs?  Avoid mountain ranges; avoid areas that are heavily forested; avoid existing towns.  Where do the patterns overlap?  What happens if current crop land is excluded?  There are enormous amounts of GIS data of various types available, and many of the ways it could be used would be valuable.  Unfortunately, the results in this paper aren't among them.
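Even without GIS layers, a back-of-the-envelope version of that metro-area question is easy to set up.  The Python sketch below asks how much land, and roughly how far out from a city, rural PV would need to cover to meet one metro area's annual electricity use.  Every number in it (the consumption figure, the capacity factor, the land-use density) is an illustrative assumption of mine, not a value from the NREL report.

import math

annual_demand_twh = 50.0       # hypothetical metro-area consumption, TWh/yr
capacity_factor = 0.20         # assumed average PV capacity factor
mw_per_km2 = 30.0              # assumed utility-scale PV land-use density

# annual energy yield per square kilometer of PV, in GWh
gwh_per_km2 = mw_per_km2 * 8760 * capacity_factor / 1000.0
area_km2 = annual_demand_twh * 1000.0 / gwh_per_km2
radius_km = math.sqrt(area_km2 / math.pi)   # radius of an equivalent circle around the metro

print(f"~{area_km2:,.0f} km^2 of PV, a circle of radius ~{radius_km:.0f} km")

A real version would then subtract forests, cropland, steep slopes, and existing towns from that circle using the GIS layers, which is exactly the sort of analysis the data would support.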

Tuesday, August 28, 2012

What I Don't Like About My E-Reader

For the last few years I've had one of Barnes & Noble's nook e-readers.  I've bought a variety of books to read on it, and borrowed e-books from the library.  But I have a problem with it.  It's not one of the normal problems that I hear people complain about.  Yes, the contrast leaves something to be desired, particularly in low light.  Yes, the power connection is in an awkward place if you're putting the reader in some sort of a rest while charging it.  Yes, I've dropped it and put a small ding on the screen.  Yes, the screen is too small to deal well with technical books.  Yes, the user interface for organizing books and other files could be better.  But those aren't what I want to complain about today.

The problem I have with my e-reader is that it tempts me to do illegal things.  Once you start poking around on the Internet, you quickly discover that there's an enormous volume of pirated print material, particularly what I'll just call "geek fiction."  All of those old books have been scanned and run through some sort of character recognition software in order to produce computer-readable versions.  New books of interest show up fairly quickly.  If there's an e-book version of the new book, unencrypted copies of that also show up quickly, avoiding the character recognition errors to which the scanned copies are subject (for example, the character pair "cl" is often recognized as the single character "d").  TTBOMK, all of the encryption schemes used by the publishers and distributors have been broken [1].

The e-reader has also tempted me to begin designing a rapid book scanner based on a digital camera, and to look at the software chain necessary to convert images of the books in my personal collection into EPUB format.  I'm not the first to do so, of course.  There's a whole community of people dedicated to do-it-yourself book scanning, and there are commercial scanners up to and including robotic models that automatically turn the pages.  I'm not getting any younger, and the day will come when I want to live somewhere smaller.  When that happens, many of the bookcases and their contents are going to have to be left behind.  But I don't want to give up a lifetime's worth of tucking away books that I've purchased.
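The software chain is the more tractable half of the project.  Here is a minimal sketch of the middle of it, assuming the pages have already been photographed and deskewed: OCR each page image and write out simple HTML that an EPUB packager (a library, or a tool like Calibre) can wrap up.  It assumes Tesseract plus the pytesseract and Pillow packages are installed; the paths and file names are placeholders.

import glob
import html

import pytesseract
from PIL import Image

pages = sorted(glob.glob("scans/page_*.png"))   # hypothetical directory of page photos

with open("book_body.xhtml", "w", encoding="utf-8") as out:
    out.write("<html><body>\n")
    for path in pages:
        # OCR the page image into plain text
        text = pytesseract.image_to_string(Image.open(path))
        # No attempt at fixing recognition errors here (eg, "cl" read as "d");
        # that kind of cleanup needs a dictionary pass on top of the raw OCR.
        out.write(f"<div class='page'><p>{html.escape(text)}</p></div>\n")
    out.write("</body></html>\n")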

There are a variety of reasons why people would want to make a copy of a print book if it were quick and easy to do.  Backup copies are an obvious one; if my house burns down, I lose hundreds of books that would be hard to replace.  My favorite reason came from a student I spoke with at the University of Denver.  At the beginning of each quarter, he spent a Saturday or Sunday afternoon, while football or another sport was on TV, scanning all of his new textbooks.  He didn't cheat in the sense of scanning the books and then returning them for a refund; he kept the dead-tree versions.  He didn't do character recognition either; he just took a picture of each page and stored it as a fairly high-resolution image on his laptop.  Then, when one of his professors said, "Now, if you'll all turn to page 257 in Wilson," he had a copy with him, and a few keystrokes pulled it up.  Meanwhile, the rest of the class was looking desperately through their backpacks to see if somehow (by accident) they had brought that book to class.

I know I'm going to build the scanner and acquire the software -- there's absolutely no way that I'm going to resist the temptation [2].  I just wish that the writers and publishers would work out an archive and pricing scheme so that I didn't have to.  Here's my advice to them, particularly for out-of-print books.  It's out of print.  You're not going to make a dime from it unless a miracle happens and enough demand materializes to make it worth a press run.  The cost to store an e-copy is trivial, as is the bandwidth to deliver it.  Make it available somewhere easy to find for a dollar or two.  At that price, I'll download a clean e-book rather than (1) scan it myself if I have it, (2) wait weeks for it to show up from my local library's network and then scan it, or (3) find someone else's scanned copy online.  And particularly if the author makes it available, s/he makes almost a dollar that s/he wouldn't have otherwise.


[1] And if not, they will be soon.  This is a technology battle that the publishers can't win.  Each has to pick a scheme.  That scheme has to run on consumer electronics with a very slow turnover, so they're stuck with it.  I have argued for years that in that situation, the encryption will be broken in a general sense -- that is, there will be software available for personal computers that will decrypt everything encrypted with a particular scheme, not just brute-force cracking of a particular item.  To paraphrase Napoleon, "I generally find that God fights on the side of the heavier artillery," and the hackers' artillery is by far the heavier.

[2] If it comes down to it, I'm competent to write the necessary software myself.  Suitable algorithms for each step have been published, and I spent years coding up other people's algorithms from time to time in order to evaluate performance.