One of the things I anticipated would happen when the Republicans regained control of the US Senate was that movement conservatives would start talking about opening the Yucca Mountain nuclear waste repository. I thought it would be somewhere down the line, after they had dealt with the PPACA, tax rates, and (in the energy realm) CO2 regulation. Silly me. In today's Washington Post column, George Will has it on his list of six things the Republicans should do first thing.
I know I'm not entirely rational on the subject. For many years, though, it has struck me as odd that there were a bunch of people making arguments that spent fuel is safe enough to store less than 100 miles upwind of a major metropolitan area in the West, but it's not safe enough to store in the states where it's produced. That the dry storage casks are so tough that nothing can happen if they fall off of a truck or train in a Western state, but not tough enough to be safe to store in the states where the waste is produced. And politically, pundits who oppose the idea that the federal government can ever force anything on individual states seem to be just fine with forcing transport and storage of spent nuclear fuel on states that (a) don't want the stuff and (b) mostly don't have reactors of their own producing waste.
I've said it before, but the federal government's record on handling radioactive waste is not encouraging. Ask people near Hanford, Savannah River, or the former Rocky Flats how they feel about it. The first radioactive waste arrived in 1999 at the Waste Isolation Pilot Plant in New Mexico, a facility that is supposed to contain the material safely for at least 10,000 years. This year, americium and plutonium particles were detected above ground a half-mile from the facility. Granted, the containers from which radioactive material leaked were much less robust than the dry casks in which spent fuel is stored. OTOH, I'm inclined to agree with Ryan Flynn, New Mexico's Environment Secretary, at a news conference after the WIPP incident: "Events like this simply should never occur. From the state's perspective, one event is far too many."
Thursday, November 6, 2014
Tuesday, October 28, 2014
A Different State-Splitting Trend?
In 2013 there were a number of proposals put forward to create new states by splitting existing ones. The most widely covered ones have been Colorado's 51st State movement, the State of Jefferson movement in far northern California and southern Oregon, and the movement in western Maryland. One thing that all of these have in common is that they represent more rural portions of the states in question seeking to "escape" from the urban parts of the states. Certainly in the cases of California and Colorado, the escape would come with a hefty price tag -- the rural areas that want to leave are subsidized financially by the urban areas.
This year there have been two new state-splitting proposals with the opposite perspective: urban areas seeking to be free from the rural areas. The first of those was the ballot initiative in California that would split the state into six parts. Funding for signature collection came from a Silicon Valley venture capitalist who reportedly believes tax rates in his new state would be substantially lower because they wouldn't have to subsidize the Central Valley and rural northern California. Too few signatures were collected for the initiative to make this November's ballot. The second is a proposal to split off southern Florida, so the urban areas there could begin taking steps to deal with climate change consequences like rising sea levels.
The proposed split of Florida is shown in the upper portion of the figure to the left. Things are in an early stage. South Miami's city commission passed a resolution calling for the separation and will now send the proposal to the 24 counties that would make up South Florida. As is true of so many proposals of this type, this one probably won't go anywhere. The hurdles for actually splitting a state are pretty darned high. Among other things, Congress would have to bless it, and the balance of power between political parties, particularly in the Senate, becomes a major consideration.
Why do I say this is a movement by an urban part of the state to flee the rural part? The lower part of the figure is a cartogram with counties resized to represent population. The green line shows the proposed division between north and south. The red/blue coloring is the standard one for election cartograms and shows whether the county voted majority-Obama (blue) or majority-Romney (red) in 2012. The three biggest urban areas in the state -- Miami, Tampa, Orlando -- would be in South Florida.
Thursday, September 18, 2014
Write Like the Wind
Today, in my e-mail, was a request for a political contribution. That's not unusual, but today, to sweeten the pot, I was offered the chance to win a dinner with George R. R. Martin, author of the Game of Thrones series. I don't dare send them any money; I might win. Then I would find it necessary to play the video below at the meal, because neither George nor I are getting any younger and it would be convenient if he finished the damned series before one of us dies.
Monday, August 18, 2014
It Takes More Than Turbines
The Telegraph ran a brief story earlier this month about UK regulators paying wind farms in Scotland nearly £3.0M to not generate so much electricity for one day. High winds due to the remnants of Hurricane Bertha coincided with a period of low local demand for electricity, and the grid lacked the capacity to carry the available power elsewhere. Such situations are neither new, nor confined to the UK: in both 2011 and 2012, the Bonneville Power Administration ordered wind generators in Oregon shut down to avoid oversupply problems. The problem in both cases is not that the grid isn't smart enough to handle the intermittent nature of wind; it's that the grid lacks the sheer bulk transport capability to move the excess wind power somewhere it could be used.
The map to the left is a portion of a graphic from an NPR story about the US power grid, showing the US portion of the Western Interconnect. The bold orange lines represent a proposal by the American Wind Energy Association for a high-capacity overlay grid that would allow full use of wind when it is available in a geographically diverse set of sites. This overlay grid passes "close" [1] to all of the major population centers in the West, as well as the best of the wind resources.
Building a reliable grid from intermittent renewable sources requires not just geographic diversity, but source diversity as well. The AWEA overlay also happens to pass close to excellent solar resources in the desert Southwest, undeveloped hydro resources in the Northwest, geothermal resources in the Great Basin, and sites suitable for pumped hydro energy storage. Balancing all of those resources against demand across eleven states (plus western Canada and a bit of Mexico) is a complex but doable task, given enough bulk transport. Speaking broadly, the situation in Oregon with too much supply and not enough demand shouldn't happen. If there really are no consumers for the excess power, it ought to be pumping water uphill against future need.
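For the curious, here's a toy sketch of the dispatch logic involved. This is my own construction with invented numbers, not any utility's model; it simply splits an hour of wind output among local demand, exports over bulk transmission, pumped-hydro charging, and curtailment.

```python
# Toy model: where does an hour of wind output go? All numbers are invented.
def dispatch_hour(wind_mw, demand_mw, export_cap_mw, pump_cap_mw):
    """Return (used_locally, exported, pumped, curtailed), all in MW."""
    used = min(wind_mw, demand_mw)
    surplus = wind_mw - used
    exported = min(surplus, export_cap_mw)
    surplus -= exported
    pumped = min(surplus, pump_cap_mw)
    curtailed = surplus - pumped
    return used, exported, pumped, curtailed

# No transmission headroom, no storage: the whole surplus is curtailed
# (the Scotland and BPA situations).
print(dispatch_hour(wind_mw=3000, demand_mw=1800, export_cap_mw=0, pump_cap_mw=0))
# Add some export capacity and pumped storage and the curtailment disappears.
print(dispatch_hour(wind_mw=3000, demand_mw=1800, export_cap_mw=800, pump_cap_mw=500))
```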
The AWEA isn't the only group that draws proposals for overlay super grids [2]. In the Western Interconnect, they all tend to look similar. As I've noted in other posts, geography plays a big role. The people are concentrated in a small number of areas; the easy routes for transportation or transmission are few and obvious; the energy resources are where they are, many of them either close to one or more demand centers, or along one of the routes between those demand centers. This is a good part of the reason that people can draw up nuts-and-bolts sorts of plans for a heavily-renewable power grid in the Western Interconnect. The other parts of the country present a much more difficult challenge. And as always, I raise the question of whether those other parts of the country will demand a single national energy policy that makes the West's efforts difficult or impossible.
[1] "Close" in the West can be rather different than close in other parts of the country. As in, "it's only a hundred miles", or "it only has to cross one mountain range".
[2] The AWEA map does get used a lot, though. Part of that is probably that it shows up on Wikimedia Commons, unencumbered by copyright. An interesting feature of the full national map is that it doesn't show any additions in the Southeast part of the country, where wind resources are rather poor.
Sunday, August 3, 2014
TABOR Conspiracy Theory
I let myself believe in conspiracy theories on alternate Tuesdays; that'll be relevant in a moment.
TABOR -- the Colorado Taxpayer's Bill of Rights -- added a section to the Colorado state constitution in 1992 that, among other things, said the state legislature couldn't pass tax increases; it could only refer such measures to the voters. In 2011, a group of state legislators sued the State of Colorado, asserting that setting tax rates is a fundamental job of government, and that by denying the state legislature that power, the state constitution was in violation of the Guarantee Clause of the US Constitution. If it's been a while since you read the Constitution, that's the clause that requires every state to have "a republican form of government."
The conventional wisdom at the time was that the suit would be short-lived. To borrow from the defense's summary, the federal court would promptly find that the matter fell under the Supreme Court's political-question doctrine and that the plaintiffs lacked standing, and the case would be dismissed. Things have not, so far, worked out that way. The district court judge ruled that the plaintiffs did have standing, that this case differed from the political-question precedents, and that the arguments should be made at trial rather than in a preliminary hearing. The defense appealed.
A three-judge panel of the federal appeals court, reviewing the question de novo, ruled the same way: the case is sufficiently different from the precedents, the plaintiffs have standing, and the case was ordered back to the district court for trial. The defense appealed to the entire court for a rehearing. This past Tuesday, I read that the full appeals court declined to rehear the case, thereby affirming the ruling by the three-judge panel. Presumably, the defense will now appeal to the Supreme Court, so it will be at least a few months before anything interesting can happen. The quickest path would be the Supremes declining to hear the appeal, in which case the district court would schedule a trial date; the wheels of justice do grind slowly.
My impression is that the defense has been poorly prepared throughout, assuming that "political question, no standing, case dismissed" alone would carry the day. When I read about the latest development, I was struck by the possibility that this was intentional and not just a matter of overconfidence. Both parties have found themselves tied in knots by TABOR when they controlled the legislature following a recession: the Republicans following the 2000-01 recession, and the Democrats following the one in 2007-09. The single-subject amendment to the Colorado constitution in 1994 makes it unlikely that TABOR could be removed in its entirety; it would have to be dismantled a bit at a time (some experts think as many as 20 separate amendments would be necessary under the current rules). "What," I thought on Tuesday, "if the secret plan is to lose the TABOR lawsuit and let the federal courts accomplish what the politicians can't?"
Nah, that's too crazy even for me on a Tuesday.
Monday, July 21, 2014
Migration
A while back at Ordinary Times, there was an interesting comment thread on the subject of defining the Midwest region of the US. One question that occurred to me while reading it was whether regions could be defined based on inter-state migration patterns. The idea grew, I suppose, out of my own experience. I lived and worked in New Jersey for ten years, but never really felt like I fit in there. Eventually my wife and I moved to Colorado, to the suburbs of Denver, where we immediately felt right at home. Most people, I suspect, are brighter than we were and don't move somewhere so "different."
I've also encountered a variety of nifty data visualization tools that look at inter-state migration in the US, like this one and this one from Forbes. State-level data for recent years turns out to be readily available from the Census Bureau. We can define a simple distance measure: two states are close if a relatively large fraction of the population of each moves between them each year. "Relatively" because states with large populations have large absolute migration numbers in both directions. For example, large numbers of people move between California and Texas -- in both directions -- because those states have lots of people who could move. From Wyoming, not so many. Given such a distance measure, this becomes a statistical problem in cluster analysis: partition the states into groups so that states within a group are close to each other. Since there's only a distance measure, hierarchical clustering seems like a reasonable choice.
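For the curious, here is a minimal sketch of that calculation. The particular distance formula and the average-linkage choice are my assumptions for illustration; the maps below weren't necessarily produced with exactly this code.

```python
# Sketch: cluster states by migration flows. moves[i, j] is the annual count
# of people moving from state i to state j; pop[i] is state i's population.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def migration_distance(moves, pop):
    """Similarity = fraction of each state's population exchanged per year;
    distance = 1/similarity, so big mutual flows mean a small distance."""
    n = len(pop)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            sim = moves[i, j] / pop[i] + moves[j, i] / pop[j]
            dist[i, j] = dist[j, i] = 1.0 / max(sim, 1e-12)
    return dist

def cluster_states(moves, pop, n_clusters=7):
    # squareform converts the symmetric matrix to the condensed form that
    # scipy's bottom-up (agglomerative) linkage routine expects.
    condensed = squareform(migration_distance(moves, pop), checks=False)
    tree = linkage(condensed, method='average')
    return fcluster(tree, t=n_clusters, criterion='maxclust')
```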
The map to the left shows the results of partitioning the 48 contiguous states into seven clusters. The first thing I noticed about the partition is that states are grouped into contiguous blocks, without exception. While that might be expected as a tendency [1], I thought there would be at least a couple of exceptions. The resulting regions are more than a little familiar: there's the Northeast, the Mid-Atlantic, the Southeast, the Midwest (in two parts), the West, and "Greater Texas". There are a couple of other surprises after reading the discussion at Ordinary Times: Kentucky is grouped with the Midwest, and Missouri and Kansas with Greater Texas. New Mexico clustered with Texas isn't surprising, but New Mexico with Louisiana and Arkansas? Hierarchical clustering is subject to a chaining effect: New Mexico may be very close to Texas, and Louisiana also close to Texas, and they get put into the same cluster even though New Mexico and Louisiana aren't very close at all.
One way to test that possibility is to remove Texas from the set of states. The result of doing that is shown to the left. As expected, New Mexico is now clustered with the other Rocky Mountain states and Louisiana with the Southeast. Perhaps less expected is that the other four states -- Arkansas, Kansas, Missouri, and Oklahoma -- remain grouped together. None of them is split off to go to other regions; the four are close to one another on the basis of the measure I'm using here.
Answers to random anticipated questions... I used seven clusters because that was the largest number possible before there was some cluster with only a single state in it [2]. The Northeast region has the greatest distance between it and any of the other regions. If the country is split into two regions, the dividing line runs down the Mississippi River. If into three, the Northeast gets split off from the rest of the East. There are undoubtedly states that should be split, e.g., western Missouri (dominated by Kansas City) and eastern Missouri (dominated by St. Louis); a future project might be to work with county-level data.
[1] My implementation of hierarchical clustering works from the bottom up, starting with each state being its own cluster and merging clusters that are close. Using the particular measure I defined, close pairs of states include Minnesota/North Dakota, California/Nevada, Massachusetts/New Hampshire, and Kansas/Missouri. These agree with my perception of population flows.
[2] The singleton when eight clusters are used is New Mexico. When ten clusters are used, Michigan also becomes a singleton, and Ohio/Kentucky a stand-alone pair.
Sunday, July 6, 2014
An Update on the War on Coal
[A longer version of this post appeared at Ordinary Times.]
It's been a tough year for coal in the United States. I generally dislike the use of war-on-this and war-on-that. But if the intended meaning is "make it much more difficult and/or expensive to continue burning large quantities of coal to produce electricity," then the phrase is accurate. Where most people who use the phrase go wrong, though, is in identifying who is actually fighting the war. It's the federal courts, and to a lesser degree some of the individual states; the EPA is just the tool through which the courts are acting. Well, also the ghosts of Congresses past, who left us with various environmental protection statutes in their current form. Since the SCOTUS hammered the coal side of the fight twice in the just-concluded term, it seems like a good time to write a little status report.
Not all the constituents of coal are combustible. Anywhere from 3% on up is left behind as ash, and even 3% of a billion tons is a lot of ash. A bit more than 40% of coal ash is typically reused in various ways: some of it can replace Portland cement in the right circumstances, some of it can be used as fill for roadbeds, etc. The remainder winds up in landfills or ash ponds. Ash ponds contain an ash/water slurry; the wet ash stays where it's put rather than being blown away by the wind. Ash pond spills are becoming more common. The federal EPA has not regulated ash ponds in the past; in January this year the DC District Court accepted a consent decree between the EPA and several plaintiffs that requires the EPA to issue final findings on ash pond problems by December. The expectation is that the findings will lead to significant new regulation, and increased spending on both existing and future ash ponds. Things are also happening at the state level. The North Carolina Senate unanimously approved a bill last week that would require the closure of all coal ash ponds in the state over the next 15 years. NC's not exactly one of your liberal Northeastern or Pacific Coast states.
Most of the visible pollutants that go up the flue at coal-fired plants have been eliminated. The picture to the left is the Intermountain generating station near Delta, Utah. The visible white stuff escaping from the stack is steam. Not visible are things like mercury compounds, sulfur and nitrogen oxides, and extremely small particles of soot. Those are all precursors to haze, smog, low-level ozone and acid rain, as well as being direct eye, nose, throat and lung irritants. Some of these pollutants can travel significant distances in the open air. In April this year, a three-judge panel of the DC Circuit upheld a tougher rule for emissions of this type of pollutant (the MATS rule). Also in April, the SCOTUS upheld the EPA's Cross State Air Pollution Rule, which will result in tighter controls on this type of emission. Approval of the cross-state rule has been a long time coming, as EPA rules that would regulate cross-state sources made multiple trips up and down the court system. The courts have always held that the EPA should regulate cross-state pollutants; the problem has been finding a technical approach that would satisfy the courts. In EPA v. EME Homer City Generation in April, the SCOTUS reversed the DC Circuit, and the CSAPR will now go into effect.
Finally, last week the Supreme Court issued its opinion in the case of Utility Air Regulatory Group v. EPA. This opinion confirmed the Court's 2007 opinion in Massachusetts v. EPA that the EPA must regulate greenhouse gases. Massachusetts was a suit brought by several states against the Bush EPA, which had decided that carbon dioxide was not harmful. I think Utility is an odd opinion, cobbled together out of three different factions on the court (more about that in a moment). The opinion has three conclusions: (a) the EPA can and must regulate greenhouse gas emissions from stationary sources, (b) the EPA can only regulate greenhouse gas emissions from stationary sources if those sources would have been regulated for non-greenhouse emissions anyway, and (c) the somewhat controversial approach the EPA is taking to the regulation is acceptable. The last one seems to me to have been sort of an afterthought. OTOH, it's likely that we'll see a number of cases about it later when the states make the details of their individual plans known.
The results of the various court decisions are going to have very different effects on different states. Compare California and North Carolina, to pick two (not exactly at random). North Carolina has 43 coal ash ponds; California has none. North Carolina, despite being a much smaller state, generates more than 30 times as much electricity from coal as California; the MATS rule will require much more effort to meet in North Carolina. The CSAPR does not apply to California; but North Carolina power plants will be required to make reductions to improve air quality in downwind states. North Carolina has to reduce the CO2 intensity of its generating plants by more than the national average; California's required reduction is much less than the average, and decisions that California has already made at the state level will probably be sufficient to meet the EPA requirements. North Carolina's electricity rates are likely, it seems to me, to be noticeably higher in the future; California's rates will remain high and perhaps go higher, but aren't going to be driven by these decisions.
Monday, June 23, 2014
Infrastructure Needs - A Cartogram
From time to time you find articles that talk about how far behind the United States is in infrastructure spending. The American Society of Civil Engineers maintains an entire web site dedicated to the topic. I have often wondered whether there are geographic patterns to the infrastructure shortfalls. One of the things that raises that question for me is the stuff I read about how the electric grid is falling apart. In the Denver suburb where I have lived for the last 26 years, and the Front Range generally, there's been an enormous amount invested in the electric grid and service seems to be noticeably improved compared to what it was when I moved here.
Bloomberg maintains an interesting collection of state-by-state numbers, including infrastructure needs. They only consider a limited number of things: roads, drinking water, and airports. I'd like to have figures that included more factors — ah, how pleasant it would be to have minions (er, graduate students) to do the grunt work — but Bloomberg is an easy-to-use starting point. The cartogram at left — double-click in most browsers for a larger version — shows US states sized to reflect Bloomberg's figure for annual per-capita infrastructure spending needs for the period 2013-2017. West Virginia has the largest value at $1,035; New Jersey has the lowest value at $78. While there are lots of things that would be interesting to regress against the numbers, in this essay I'm just thinking about geography.
One of the obvious things that jumps out is that high-population states do better. California, New York, Florida, and Illinois all fall into that group. The most likely reason would seem to be that there are economies of scale involved in the things Bloomberg measures. An airport can serve more people in a high-population state; highway lane-miles are used by more people; doubling the capacity of a water- or sewage-treatment plant doesn't mean that the cost of the plant will be doubled.
Another factor appears to be that states with high population growth over the last 20 years do better. California, Colorado, Texas, Georgia and Florida are examples. In this case, the likely reason would be that rapidly growing populations have made infrastructure spending a critical need. To use Colorado as an anecdotal case, since I live here and pay some attention, Denver built a major new airport, my suburb greatly expanded its water-treatment plant, and I-25 along the Front Range has been subject to an entire series of improvements (if you drive its length, the question is not whether part of it is under construction, but how much of it is).
Finally, some simple regional observations. Here's a standard map of the 48 contiguous states using an equal-area projection for comparison. The 11 contiguous western states appear to be in much better shape than the country as a whole. As might be expected, Wyoming and Montana, the two western states with the smallest populations, do the worst in that region by a wide margin. With a caveat that this is state-level data, the upper Great Plains, New England, and Appalachia all do very poorly. With a small number of exceptions, the East Coast looks particularly bad.
I think I'll just sum this up with the obvious statement: "There are new parts of the country, and old parts of the country, and the new parts tend to have more and shinier stuff per person."
Sunday, June 22, 2014
Western Secession 7 -- The Age of Electricity
Lots of people who believe that Civilization is Doomed because of energy constraints talk about this being the Age of Petroleum. As far as transportation goes, that's absolutely true. But it's not really the most critical aspect of our current high-tech society. This is really the Age of Electricity.
If petroleum were to slowly go away, eventually reaching zero, there are alternatives. Existing land transportation can become much more efficient: smaller vehicles, trains instead of trucks, etc. Alternate power sources are available, at least for some applications: smaller electric vehicles, electric trains instead of diesel, etc. Goods can be produced closer to where they are consumed in order to reduce the amount of transportation required to deliver them. Slower transportation and delivery of objects helps — delivering the small package by electric train and electric vehicle uses much less energy than flying the package overnight. More expensive synthetic alternatives to petroleum-based liquid fuels exist for situations where alternatives are impractical.
OTOH, we have reached a point where there is no substitute for electricity. This short essay was written on a computer; it was uploaded to Blogspot and copied into some number of their computers; the copy you're reading was downloaded from one of those servers. The non-electric alternative is paper and ink (since radio and television also require electricity in large quantities). If that were the only medium available, chances are that you would never see this. Paper and ink distribution imposes serious limits on how many people's writing gets distributed widely. I'm using "widely" in the sense of making it possible for many people in many locations to read it. Having the NY Times publish it would count, as the NY Times is popular enough to have nation-wide physical distribution (although without electricity, that may mean being a day or two behind). Putting it up on a wall in a public place in Arvada, CO doesn't count.
Recall that the original "wire services" that distributed stories to local newspapers were called that because it was a description of what they did. Stories were collected and distributed by telegraph and later teletype, both of which require electricity. David Weber's popular Safehold series of science fiction novels, set on a world where the use of electricity is strictly forbidden, envisions a semaphore network instead. It's slow, it's even slower at night, it fails temporarily when the weather is bad enough, and it fails completely when the message has to cross a large enough body of water (discounting transcribing the content, moving it physically across the water, then putting it back on the semaphore network). High-speed communication means electricity.
Electricity is a key consideration in developing countries as well, with China as the most interesting case. Their population is extremely large. As recently as three decades ago, that population was desperately poor. The government is working — at a pretty hectic pace — to urbanize and find non-farming work for what was an enormous peasant population. In order to do that, electricity is vital. As a result, if it will generate electricity, China is deploying lots of it. Coal, natural gas, nuclear, hydro, wind, solar... China's rate of growth in the use of all of those is among the very highest in the world. India has been less aggressive about expanding its grid, leading the head of one of that country's software development firms to say, "Job one is acquiring the diesel fuel to power our private generators; job two is writing software, and doesn't happen if we fail at job one."
Dependency on electricity has been greatly increased by the integrated circuit revolution. The common design approach for an enormous range of things is now a processor, a batch of sensors (some as simple as push buttons), and a handful of actuators. All of the difficult parts are implemented using software. Television is now digital, and depends on billion-transistor integrated circuits for every step from source to final viewing by the consumer. Film has disappeared. Music is (at least the vast majority is) delivered in digital formats dependent on those same integrated circuits. The banking system depends on computers to run the check clearing house, the stock markets are all electronic, the Post Office depends on computers to read addresses and route mail... I told my bosses at Bell Labs that it was a software world back in the late 1970s; it has only become more so.
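As a caricature of that pattern (purely illustrative; the names are hypothetical and any real device has its own I/O interfaces), the whole design often boils down to a loop like this, with everything interesting hidden inside the software:

```python
# The processor/sensors/actuators pattern, reduced to its skeleton.
def control_loop(read_sensors, compute, drive_actuators):
    """All of the difficult parts live in compute(), i.e. in software."""
    while True:
        inputs = read_sensors()      # push buttons, thermistors, encoders, ...
        outputs = compute(inputs)    # the hard part, done in software
        drive_actuators(outputs)     # relays, motors, displays, ...
```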
All of this may seem trivially obvious, but any plan to ensure that modern technology continues on into the future depends on maintaining robust reliable supplies of electricity. The next post in this series will look at where the US gets its electricity today.
Monday, June 16, 2014
A Thought on the EPA's New CO2 Rule
As described in an earlier Grist piece, each state will have its own target for reduction of CO2 emissions, and each state will be allowed to develop its own plan for achieving the necessary reduction. Washington will have to reduce its emissions by about 70%; North Dakota will only have to reduce its emissions by about 10%. The EPA formula(s) (PDF) for calculating the required emissions targets are complicated and consider a number of factors.
One of the factors that is not included is where the electricity is consumed. Some states produce more electricity than they consume, others produce less. The graph to the left shows the approximate net exports for each state, in megawatt-hours [1]. California is at the top, with a negative value indicating it is a large importer of electricity. Pennsylvania is at the other end of the chart and is the largest exporter.
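Footnote [1] below spells out the computation: net generation minus retail sales, with each state's generation scaled by the national sales-to-generation ratio so that exports sum to zero across the states. Here's a minimal sketch with made-up numbers; the real figures come from the EIA state profiles.

```python
# Net exports per footnote [1]; the three states and their numbers are placeholders.
def net_exports(net_generation, retail_sales):
    """Both arguments map state -> MWh for one year. Generation is scaled by
    the overall sales/generation ratio so the results sum to zero."""
    ratio = sum(retail_sales.values()) / sum(net_generation.values())
    return {s: net_generation[s] * ratio - retail_sales[s] for s in net_generation}

print(net_exports(net_generation={'A': 200e6, 'B': 150e6, 'C': 100e6},
                  retail_sales={'A': 260e6, 'B': 90e6, 'C': 55e6}))
# Negative => net importer (like California); positive => net exporter.
```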
Tracking exports and imports in more detail can be difficult. Some cases are relatively straightforward. Xcel Energy owns the coal-fired Comanche power plant in Pueblo, CO and sells the electricity generated there to consumers up and down the Front Range. The 1.9 GW coal-fired Intermountain power plant in Utah is owned by utilities in California and Utah. 75% of the plant's output goes by HVDC transmission directly to San Bernardino County, CA; the remainder goes to utilities and electricity cooperatives in Utah. The coal-fired Jim Bridger power plant in Wyoming is owned by Berkshire Hathaway and sells its output to two utilities operating across six states. Oregon is one of those states. Oregon is a net exporter of electricity, primarily hydro electricity sold to utilities in California. Oregon generates a modest amount of in-state power from coal and imports coal-fired electricity from Wyoming and Utah.
Reducing CO2 emissions will require that money be spent on coal-fired power plants -- on sequestration technology, or on efficiency improvements [2], or on fuel conversions. That money will eventually be collected from the pocketbooks of electricity consumers. The fact that there are states that are exporters and importers of electricity would seem, at least to me, to create an opportunity for a certain amount of mischief. That is, a state's plan for reducing CO2 emissions might be structured so that, as far as is possible, out-of-state consumers pay for the necessary changes. From the examples in the preceding paragraph, Wyoming and Utah have an interest in getting California and Oregon to foot as much of the bills as possible. In addition to Ben Adler's list of reasons that the EPA's final rule will end up in court, look for the distinct possibility of some states (and interstate companies) suing other states over their plans.
Myself, I'm in the camp that says, "A carbon tax would have been enormously simpler." Reality, though, forces me to acknowledge that politics is the art of the possible, that such a tax would be DOA in Congress, and that not allowing Congress to delegate taxes and tax rates to the EPA is a good thing.
[1] Data from the EIA's state electricity profiles for calendar year 2012, total net generation minus total retail sales. For the US as a whole, net generation exceeds retail sales by about 10%. Each state's generation figure is scaled down by the US ratio so that the US Total exports comes out zero.
[2] An older conventional coal-fired plant may have 30% thermal efficiency. That is, 30% of the heat energy released by burning the coal is converted to electricity. New technology may achieve 45% thermal efficiency. Such technology would lower CO2 emissions by 33% for the same amount of electricity.
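A quick check of the arithmetic in [2]: CO2 emissions per unit of electricity scale with the coal burned, which scales inversely with thermal efficiency.

```python
# Same electricity, better efficiency, less coal burned, less CO2.
old_eff, new_eff = 0.30, 0.45
reduction = 1 - old_eff / new_eff   # = 1/3, i.e. about a 33% cut in CO2
```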
Sunday, May 25, 2014
What Is It With Economists and Spreadsheets?
The hot book in economics this year is Thomas Piketty's Capital in the Twenty-First Century. The book addresses inequality, asserting that inappropriate levels of it are a natural outgrowth of capitalism. It makes sweeping policy proposals, such as a global tax on wealth and much higher income tax rates on the upper brackets. It has been favorably reviewed by liberal economists like Paul Krugman and criticized by conservative economists. Recently we learned one more thing about it, as reported by the Financial Times: calculations critical to the argument relied on error-plagued spreadsheets.
Last year we went through the episode of the Reinhart-Rogoff paper that asserted that national public debts in excess of 90% of GDP killed economic growth. Conservative politicians jumped on the paper as a justification for proposing drastic changes in US public policy. As it turned out, critical calculations were done by spreadsheet, the spreadsheet had errors, and when the errors were corrected the "cliff" in economic growth at 90% disappeared. The damage had already been done, though. The 90% debt-to-GDP cliff was quoted in a variety of government reports, and those secondary sources continue to be cited in policy debates today.
What is it with economists and spreadsheets? Spreadsheet software is a programming system. There is a large literature on the frequency and nature of spreadsheet errors (the European Spreadsheet Risks Interest Group's annual conference on the subject will be held in July this year). Even using best practices, complex spreadsheets contain errors at a rate that would be completely unacceptable in any other programming environment. Not that there seems to be any evidence being offered that the economists mentioned above were making use of those best practices. For example, I have yet to read that Reinhart and Rogoff conducted formal code reviews.
One part of this is particularly puzzling to me. Some years back I spent two semesters in a PhD economics program. The econometrics classes used R and Gauss. There was never even a hint that Excel was an acceptable method for doing research calculations. Certainly none of the graduate students I met who were working on their dissertations were using a spreadsheet to do the analysis. So to find highly-respected academic economists using spreadsheets is surprising. They have to know that if the spreadsheet is even moderately complex, errors are creeping in. Quite possibly embarrassing errors [1].
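To make the contrast concrete, here is a minimal sketch in Python/pandas (emphatically not the Reinhart-Rogoff calculation itself) of the kind of scripted analysis that can be diffed, reviewed, and re-run. The data file and column names are hypothetical.

    import pandas as pd

    # Hypothetical country-year panel with columns "country", "year",
    # "debt_to_gdp" (percent), and "growth" (percent).
    df = pd.read_csv("debt_growth.csv")

    bins = [0, 30, 60, 90, float("inf")]
    labels = ["0-30%", "30-60%", "60-90%", ">90%"]
    df["bucket"] = pd.cut(df["debt_to_gdp"], bins=bins, labels=labels)

    # Average within each country first, then across countries, so a single
    # long-lived country cannot dominate a bucket.
    by_country = df.groupby(["bucket", "country"], observed=True)["growth"].mean()
    print(by_country.groupby(level="bucket").mean())

Unlike a spreadsheet, every step above is visible in one place, under version control, and trivially re-runnable when the data change.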
For some decades, economists have been accused of suffering from "physics envy." That is, they want their field to be considered a hard science like physics, not one of the so-called soft sciences. I'll offer economists a piece of free advice on that: hard sciences don't do data analysis with spreadsheets. Clean up your act. Require authors to certify that they use real tools for numerical work, reject out of hand papers that don't, and harshly punish people who lie about it.
[1] Using better tools is no guarantee that errors won't creep in. But they offer a better chance of catching them.
Wednesday, March 26, 2014
Mike the Pythoneer...
A few weeks back my old Mac Mini got to the point where it was giving me the "gray screen of death" every 18 to 36 hours [1]. Replacing the RAM -- failing RAM being the most frequent cause of kernel panics -- didn't fix the problem. I decided that six-and-a-half years was a good run, that I had outgrown the RAM size limit, that having a graphics chip too dumb to support acceleration for OpenGL wasn't good, and that having no OS upgrade path for the old machine was a bad thing. So I got a new Mini. It has not been an entirely painless process, largely because there have been a bunch of changes in the Apple development tools.
I have a couple of pieces of software that I use regularly that I wrote in Perl. I'm entirely dependent on one of them, a note-taking application that also acts as the front-end for (sort of) organizing a whole collection of files of various types -- PDFs, images, old source code -- that is pretty static these days [2]. Another one, part of a package for drawing cartograms, is under off-and-on development. Both were broken under the default Perl/Tcl/Tk that comes as a standard part of the new OS X, and required some effort to get running again. To get the note-taking app running I ended up downloading and using ActiveState's Perl instead of the Apple default. At one point in the past I had toyed with the idea of rewriting the note-taker in Python (for other reasons) and had some code that tested all of the necessary GUI functionality; that old Python code ran on the new Mini with no problems using the Apple default installation.
Reading a variety of Internet things led me to (perhaps erroneously) conclude that: Apple is moving away from Perl and towards Python for scripting; Tkinter is a required part of the Python core, but Perl has no required GUI module; so Apple is more likely to keep the chain of Python/Tcl/Tk working in the future. Suddenly, switching to Python seemed much more compelling than before. I also came across Eric Raymond's old "Why Python?" article from the Linux Journal. His experience seemed to match my own recollections of writing that little Python program related to the note-taker: I got over the "eeewww" thing about the use of white space to identify block structure fairly quickly, and I seemed to be writing useful non-toy code fairly quickly.
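As a point of reference, that kind of smoke test amounts to very little code. A minimal sketch (modern Python 3 spelling, and far less than "all of the necessary GUI functionality"; not the original test code):

    import tkinter as tk

    # If this window appears, the Python -> Tkinter -> Tcl/Tk chain is intact
    # on this machine, which is the whole point of the test.
    root = tk.Tk()
    root.title("Tk sanity check")
    tk.Label(root, text=f"Tcl/Tk version: {tk.TkVersion}").pack(padx=20, pady=20)
    tk.Button(root, text="Quit", command=root.destroy).pack(pady=(0, 20))
    root.mainloop()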
One of my favorite (and I use that word with some ambiguity) time-wasters on my computer is FreeCell. Since Windows 95, Microsoft has always provided a simple version of the game as part of the standard distribution. When I switched to a Mac at home several years ago, the lack of a simple free reasonably-attractive version of the game grated on me [3]. I ended up using a Javascript version in my browser, but recently that one began to act flaky, putting up placeholders instead of images for some of the cards. "Two birds with one stone," I thought. "Practice Python and get a version of FreeCell that looks and behaves like I want."
The good news is that with a couple of manuals open in browser tabs, writing the game went quickly. Call it 15 hours total over three days to go from nothing to a running version that my fingers are almost comfortable with. And by nothing, I mean just that: no cardface images, no thoughts on data structure. A blank slate. Some of that time was doing small sorts of rewriting, when I would find an example that showed a better Python idiom. (A minimal sketch of the deal logic appears after the list below.) The bad news comes in several parts:
- There's a version of FreeCell that my fingers know just sitting there on the desktop now, begging to be "tested".
- Feature creep is going to be an issue. It should have undo. It should have the ability to remember the current layout and return to that. It should have a built-in solver that works from any position.
- It should run on my old Linux laptop. It should run on my Android tablet. It should run on my wife's iPhone (well, maybe that's a stretch).
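For the curious, the core data structure is small. Here is a minimal sketch of the deal (not the game described above, just the general shape of it): 52 cards dealt round-robin into eight cascades, plus four empty free cells and four foundations.

    import random

    RANKS = "A23456789TJQK"
    SUITS = "CDHS"

    def new_deal(seed=None):
        deck = [rank + suit for suit in SUITS for rank in RANKS]
        random.Random(seed).shuffle(deck)
        cascades = [deck[i::8] for i in range(8)]   # first four get 7 cards, the rest 6
        free_cells = [None] * 4                     # single-card holding spots
        foundations = {suit: [] for suit in SUITS}  # built up A..K by suit
        return cascades, free_cells, foundations

    if __name__ == "__main__":
        cascades, cells, foundations = new_deal(seed=1)
        for i, column in enumerate(cascades, start=1):
            print(f"cascade {i}: {' '.join(column)}")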
[1] In the event of certain kernel panics, Mac OS X puts up a translucent overlay to block the display, along with a dialog box that says "You have to restart your machine" in several languages. Until recently, I had no idea that such a thing even existed. For the record, since it no longer has to support nearly as many processes, the old Mini has been up without a problem for more than three weeks.
[2] At last count, a few hundred pages of notes and more than 100M of images, PDFs, etc. As to why write my own when there are dozens of note-taking applications out there, let's just say that I'm an old geek and paranoid and don't like to have critical data stored in a proprietary file format.
[3] I'm sure that all of the authors of the solitaire packages out there think their games are laid out attractively. I just happen to disagree.
Tuesday, March 18, 2014
3D Printing
Seth Stevenson at Slate has a column about his disastrous attempts to run a 3D printer. The picture to the left is one of his attempts. His experience seems educational, given that the price of printers continues to fall. Slate apparently gives Seth a bigger hobby budget than my wife gives me, since they bought him a printer.
The Solidoodle 4 looks like a really nice little hobby machine. Steel frame and covers, compact footprint, relatively large working volume (a cube eight inches on a side -- you can build relatively large things), ability to use either ABS or PLA plastic. Billed as being driven over a USB connection from a Windows, Linux, or Mac machine. Retail price tag of $999, fully assembled (many low-cost 3D printers come as kits, with some degree of assembly required).
Seth's experience, though, demonstrates that there's more to 3D printing than just taking the gadget out of the packaging and firing it up. Calibration, cleaning, temperature selection, little details like sometimes spraying the bed on which the object is printed with hair spray so the plastic adheres properly, putting up with the noise and the smells. If the experiment had been more successful, there would have been the longer-term problem of maintaining an inventory of materials (i.e., do you have enough red plastic on hand to make that spiffy Christmas tree ornament?). The list of problems reminds me of homemade printed circuit boards.
You can make a printed circuit board at home in your garage or basement [1] in a matter of hours. The results look like homemade PCBs. OTOH, if you need to make PCBs infrequently, you can upload the design files to a service bureau that will, at a cost of a couple of dollars per square inch, make the board for you and mail it to your home. You get a much higher quality result, at the cost of time (a couple of weeks is typical) and a bit more expense. When I built the World's Most Sophisticated Whole-House Fan Controller™ a few years ago, I had a service bureau make my boards, and was very happy with the entire experience [2].
There are a rapidly growing number of 3D printing service companies. Generally speaking, they're going to have better printers than I could possibly afford -- unless I become a service company myself -- because they keep them busy; they already know things like the proper temperature setting for all of the materials they use; they have experience with the idiosyncrasies of their equipment; and they keep suitable stocks of material on hand. I would be very interested in reading a column where Seth tries that route as well. Certainly for the time being, any 3D printing experiments I conduct will be done through a service bureau.
[1] Given the nature of the chemicals and the stains they can make, don't try this in your kitchen. Trust me on this.
[2] Some of that might be that I ordered two copies but they sent me four for the agreed-upon price. Since I eventually used three of the boards (due to operator error and a voltage surge), that turned out very well.
Monday, March 10, 2014
TABOR Lawsuit to Proceed?
You know you're a policy wonk of some sort when you find the following of great interest: last Friday the 10th Circuit US Court of Appeals ruled [PDF] that the Colorado TABOR lawsuit could go forward. No word yet on whether the defendants in the case will appeal to the SCOTUS or not. For those to whom this might have passing interest...
TABOR, the Taxpayers Bill of Rights, is an amendment to the Colorado state constitution passed in 1992 which imposed two main restrictions on state and local governments across the state. First, it restricted the rate at which government spending could increase year-over-year to the sum of population growth plus inflation, unless increases beyond that were approved by a vote of the people in the jurisdiction. Excess revenue beyond what could be spent had to be refunded. Second, it required that new taxes, or increases to existing tax rates, must also be approved by a vote of the people. The effect was to impose direct-democracy restrictions on the ability of elected government officials to set tax rates and spending levels.
Whether by design or coincidence, the most serious consequences of the TABOR restrictions come into play during recessions. Following the recession of 2001, the Colorado state budget was in sufficiently dire straits that the legislature put Referendum C on the ballot in 2005. Ref C provided a five-year time-out on TABOR refunds, and set a new floor from which future spending increases would be measured. Ref C was supported by then-governor Bill Owens; many people believe that support cost Mr. Owens a bright future on the Republican national stage. The recession of 2007-09 also put large pressures on the state budget. In response to the pressures brought on by recessions, the Colorado legislature has resorted to a certain amount of accounting trickery: the Colorado Opportunity Fund shields a considerable part of state higher education funding from TABOR, as did a rewrite of the unemployment insurance statutes to replace the word "tax" with "premium" wherever it appeared.
In 2011, a group of current and former members of the state General Assembly filed suit in federal court challenging TABOR on the grounds that by removing control of taxes and spending levels from the legislature, Coloradans were denied the "Republican Form of Government" guaranteed by the US Constitution. To this point, the argument has been whether the plaintiffs have standing to sue, and whether the entire matter is non-justiciable due to the political question doctrine. In 2012, the District Court ruled that (a) plaintiffs had standing and (b) this case is sufficiently different from others settled in the past that the political-question doctrine did not apply. Last Friday, the Appeals Court agreed, and remanded the case to the District Court for further proceedings.
Opinions. Bear in mind that IANAL (although I had to pretend to be one at times while I was working for the state legislature), nor am I without bias in this matter:
- To this point, the defense has been remarkably lazy. Their argument consists basically of "state legislators don't get standing just because they lose a vote" and "the SCOTUS says Guarantee Clause matters are non-justiciable." In fact, the SCOTUS decisions are much more nuanced than that. Where the defense has ignored the details entirely, plaintiffs appear to have done their homework on those nuances, building arguments that all of the SCOTUS conditions for standing and justiciability are satisfied. Thus far, the District and Appeals Courts have concurred.
- In light of this, I expect the defendants to appeal to the SCOTUS. I say that because I expect the defense has been just as lazy in preparing arguments in the event that the case goes forward. The better odds for success at this point lie in getting a conservative SCOTUS -- and politically, TABOR is a darling of small-government conservatives -- to say the Appeals Court misinterpreted things and toss the whole case.
- Should the case go forward, the defense argument will largely be "the legislature gets to split the pie up however they want, subject to other restrictions that aren't being challenged in this suit -- they just don't get to set the size of the pie." And that such a restriction doesn't impose an undue burden, since they can always ask the voters to make the pie bigger (although in Colorado, making such a request requires a two-thirds super-majority in each chamber).
- Should the case go forward, and the defendants make that argument, they will lose -- setting the size of the pie will be defined to be a core legislative function. And lose again on appeal. And then win in the SCOTUS, assuming the current make-up.
Saturday, March 1, 2014
A Handset? Really?
News stories today reported that President Obama spent 90 minutes on the phone with Russian President Putin, warning him about the consequences of taking military action in Ukraine. This is the picture that accompanied one such story. Seriously, dude, do you want me to believe that you're dumb enough to spend 90 minutes on an important diplomatic call holding a handset?
Over the course of my professional career(s), I spent a lot of time on long-duration phone calls. I will guarantee that if you do so, at some point in the call there comes a moment when you need both hands free. Maybe you have to type something into the computer. Maybe you need to pin the piece of paper down while you jot a note. Maybe you just need both hands free because you can't say "Vladimir, you have no idea who you are f*cking with!" without appropriate hand waving [1]. The moment comes, and you can't do the proper thing if one hand is busy holding a handset up to your ear.
Before making or taking a call that was supposed to last that long, I had my headset on. I still wear a headset even if I'm just calling my Mom, let alone making a business call. And my headset is no doubt behind the times. If I had the NSA at my beck and call, you can d*mned well be sure that I would have the world's absolute best echo cancelers in place, and I'd be talking in the open air, hands waving the whole time, without benefit of a headset.
Let's get with the times. A handset on a coiled cord is not the impression we're trying to make.
[1] Just so you remember why I couldn't get elected dog catcher, let alone something that requires even more diplomacy :^)
Photo credit: Official White House photo by Pete Souza
Friday, January 31, 2014
Two Nuclear Paths
In a previous post I laid some groundwork for eventually arguing that different parts of the US are following two very different policies with respect to nuclear electricity. It's not a new topic for me; I've written about the differences in the size of the role nuclear has in powering the three electric grids in the US; that nuclear power is much more important in the Eastern Interconnect than in the Western; and that the evidence provided by plans for new reactors suggests things will continue that way. Over the last several months there have been a number of events that reinforce my thinking on the subject.
Georgia Power placed the 900-ton bottom portion of the containment vessel for reactor unit 3 at the Vogtle power plant. Georgia is an ideal example of a state that needs nuclear power. The state has, at least compared to its growing demand, quite limited undeveloped renewable resources suitable for generating electricity. For that matter, the state has little in the way of fuel resources, depending on imports of everything. It always startles me to remember that the 3.5 GW coal-fired Scherer power plant is fueled exclusively with Wyoming Powder River Basin coal that travels 2,100 miles to be burned. Long supply lines can be fragile. The 2011 Mississippi, Missouri, and Ohio river floods were only the most recent to create various disruptions in rail transport from west to east. Nuclear fuel is much more compact and refuelings infrequent.
In a different direction, Southern California Edison notified the Nuclear Regulatory Commission that it will retire the two reactors (half of the California fleet) at the San Onofre power plant rather than repairing them. The plant was taken offline in January 2012 when excessive wear in steam-generator tubes resulted in the release of a small amount of radioactive steam. Last summer, SCE urged conservation and the California independent system operator gave permission for two gas-fired generators to be temporarily restored to service. Those generators had been shut down as part of a clean-air effort, with plans to replace them with new generators in a better location.
Also in the Western Interconnect, an interesting study of the Columbia Generating Station (the nuclear power plant in Washington State) was published. The report estimates that the cost to produce electricity at CGS is significantly higher than the price at which the same amount of electricity could have been purchased in the wholesale market. The report identifies several reasons why that is so, including: it's the only nuclear plant the owner operates, it is located far from the demand centers where its electricity is consumed, and that the region now has a surplus of renewable generating capacity.
On a final note, a number of industrial-scale solar power plants began delivering power to the grid. These include the 250 MW California Valley Solar Ranch (PV); the 130 MW Tenaska Imperial Solar Energy Center South project (PV); and the 392 MW Ivanpah Solar Electric plant (central-tower solar thermal). Summer 2014 ought to be interesting in Southern California — particularly if it's a hot one — as they try to juggle intermittent renewables and natural gas to balance the decommissioning of San Onofre.
[1] Clean up the ash ponds. Install new pollution control equipment for fine particulates and various noxious gases. Potentially, pay a carbon tax or buy emission permits for the CO2. There are a number of reasons to think that coal-fired electricity will cost more in the future.
Friday, January 24, 2014
Western Secession 6 - East vs West in Maps
The broad theme of this series of posts is that a peaceful partition of the US into at least two parts is likely in the middle sort of future (probably more than 25 years, probably less than 50). The particular partition that I think about is East and West. Previously, I argued that there is a natural geographic dividing line between the two: the Great Plains region, already pretty empty of people and generally getting emptier. This post puts up a whole pile of maps in order to argue that there are fundamental differences in the situations faced by the East and the West, which will in turn lead them to want to take different paths in solving some problems. When those differences become more important than the similarities between the regions, separation becomes a viable option. Some of these maps have appeared in earlier posts.
Mountains. One-third of the 48 contiguous states isn't like the other two-thirds, as shown in the relief map to the left. From the western edge of the Great Plains to the Pacific Ocean, the terrain is dominated by mountains. East of the Great Plains, not nearly so much. The highest point east of the Great Plains is only 1,400 feet higher than my house in a Denver suburb; I routinely make up that difference on "easy" hikes up into the foothills. The difference in terrain has a number of consequences, as discussed along with the next several maps. Map credit: U.S. Geological Survey. (Aside: I love this map. The USGS says it's based on 12 million elevation data points extracted from their topo maps. A 56"x36" paper version is available for $12.00.)
Settlement patterns. The mountainous terrain dictated where people could settle (and continues to do so today). Steep gradients mean that the rivers are generally not navigable over long distances (the Columbia being a limited exception). Areas where agriculture is practical, and often the types of agriculture, are limited (the growing season at altitude can be remarkably short). All of this dictates where significant numbers of people can live, summarized in the population map to the left (each white dot represents 7500 people). In some ways, the western part of the US is more urban than the eastern part. Not in the sense of tall buildings and small apartments, but rather that a larger majority of the people live in the urban and suburban areas of a few metro areas. Metro areas are fewer, and much farther apart. The spaces between metro areas are empty in a way that occurs rarely in the eastern part of the country. Map credit: U.S. Census Bureau.
Transportation. The map to the left shows truck freight volume by federal highway route for 2007, with thicker lines indicating more tonnage. Just as the terrain had a large influence on where sizable cities are possible in the West, mountain ranges (and more importantly mountain passes) dictate where most of the transportation routes must run. In some cases, the routes today are the same routes that wagon trains used when they headed out across the Great Plains bound for the West Coast. Wyoming's South Pass is the only sane place for a busy freight route to cross the Rockies between Colorado and the Canadian border. Implicit in this map is that a good deal of the east-west traffic involves transport between the coastal port cities and the more heavily populated East. Rail freight volumes show a similar pattern, with the addition of a huge-volume route headed east from northeastern Wyoming. That route carries very large shipments of Powder River Basin low-sulfur coal to eastern power plants. Map credit: U.S. Department of Transportation.
Federal land holdings. As a result of the settlement patterns and timing (the federal government made an enormous change in public land policy around 1900), the federal government has very large land holdings in the western states. The cartogram to the left, where states have been resized to represent the area owned by the feds, illustrates the point. This has made life difficult for state governments in many ways. Policies affecting a variety of things — some not immediately obvious — can be difficult to manage when the largest landowner in the state (about 40% of the land, on average) is free to simply ignore the state law and do what it pleases. Local resentment towards federal ownership rises and falls in cycles, and seems to be on the upswing again in recent years. Map credit: author's own work, using a wrapper around Mark Newman's highly useful cart and interp programs.
Water. Precipitation west of the Great Plains is very low compared to the areas on the east side. The areas with the heaviest precipitation are, for the most part, mountain ranges or valleys between ranges where the water falls as snow in the winter. Agriculture in the West has always been about storage and management of water. As Mark Twain is famously credited for saying, "Whiskey is for drinking; water is for fighting over." Irrigation is important even in the Pacific Northwest, which appears to be much wetter, due to seasonal variations. During the critical growing months of July and August, Seattle and Portland are as dry or drier than Phoenix and Denver. Phoenix and Denver get summer rainfall from thunderstorms triggered by the North American Monsoon that does not reach Oregon or Washington. The prime irrigation example is California's Central Valley: with irrigation it is perhaps the richest farming area in the world; without irrigation, it's a semi-arid near-desert. Map credit: Oregon Climate Service.
Fire. The last three maps painted a picture of a West that is sparsely settled, dry, and with large areas held by the federal government, much in the form of undeveloped national forests and wilderness areas. That's a nice prescription for wildfires. Fire is, in fact, a natural part of many western ecosystems. For example, some tree species have evolved so that the heat of a fire (which burns off brush and grass that would compete with the seedlings) is required to release their seeds. The map to the left illustrates the number of wildfires from 1980 to 2003 that covered more than 250 acres individually. 250 acres is, by western standards, a small fire. In most recent years, at least one western wildfire has reached at least 100,000 acres. Some have been several times that large. Western wildfires have become much more dangerous and damaging in recent years, though, in part due to misguided fire-suppression policies on federal land during the first half of the 20th century that allowed huge amounts of fuel to accumulate. Map credit: U.S. National Aeronautics and Space Administration.
US electric power grids. The next few posts in this series are going to talk about electricity. I make no bones about it — I believe that managing the transition from fossil-fuel powered electricity generation to something else, in quantities sufficient to support modern tech, is the public policy problem for the next 50 years. The US power grid is actually three grids that are largely independent, illustrated in the map to the left. The dividing line between the Western Interconnect and the others falls largely within the Great Plains. That division isn't surprising. Historically, the three grids grew out of the connections between large utilities, and the Great Plains are a wide (and expensive) barrier for long-distance high-capacity power connections to cross. In addition, much of the Plains region is served by rural electric cooperatives rather than larger utilities. The important point to make here is that the two large interconnects are managed separately. Map credit: Real Energy Services blog.
Solar and onshore wind renewable energy resources. I also believe that there will be important differences in opinion on how to solve the supply problem, divided largely along the line between the Eastern and Western Interconnects (or down the middle of the Great Plains, or between mountain and non-mountain, wet-vs-dry, or any of several other factors that all yield much the same result). Think of it in terms of the answers to the question, "Where will the non-fossil-fuel supply of electricity come from?" Wind and solar (and conventional hydro) are renewable sources with large potential. The map to the left shows where good on-shore wind and solar resources occur in the US. Basically, from the Great Plains west. Another important east-west difference is that many of the best resources in the West are relatively close to major population centers. This map doesn't include conventional hydroelectricity; compared to demand, the large share of undeveloped hydro also falls in the Western Interconnect. Map credit: Recycled Energy blog, using data from the National Renewable Energy Laboratory.
Nuclear power plants. The other large existing non-fossil source that is commonly discussed is nuclear fission (commercial fusion has been 30 years away for the last 60 years, and according to the ITER time table, still is). The map to the left shows the location of all commercial power reactors in the US. Fission power is very much an eastern thing. There were never a large number of reactors in the Western Interconnect, and the number has been steadily declining: the Ft. Saint Vrain generating station in Colorado, the Trojan station in Oregon, and most recently the pair of San Onofre reactors in California have been decommissioned. The highlighted reactor is the Columbia Generating Station located on the Hanford Nuclear Reservation in Washington, which was the subject of a recent analysis whose conclusions were quite negative. One might think that the geographic distribution would also make the problem of storing long-lived nuclear waste largely an eastern thing, but political power has — so far — dictated that waste burial will be consigned to the West. The overall distribution of fission plants seems unlikely to change; all of the proposed new reactors that have reached the NRC license review stages are located in the Eastern or Texas Interconnects. Map credit: McCullough Research's Economic Analysis of the Columbia Generating Station.
Saturday, January 18, 2014
The Internet of Things
One of my friends makes an annual pilgrimage to Las Vegas for the Consumer Electronics Show, walks tens of miles of the convention floor aisles, and sends out a sometimes serious, sometimes tongue-in-cheek review of the overall theme. This year he reports that it's "the internet of things." The chip makers are producing the hardware to make it cheap to embed a processor, wifi, and IP stack; the consumer companies are racing to put the hardware into everything you can imagine (and some that I certainly didn't).
Interestingly, along with his e-mail announcing this year's report, I found an article from the BBC about a smart refrigerator that had been hacked and included in a spam-bot network [1]. I suspect that this is just the beginning of the problem. As the article notes, security is probably not high on the list of features the consumer electronics firms are working on. After all, security is hard, it's largely invisible (except when it's annoyingly visible), and who's going to buy their smart refrigerator based on how secure it is? Like most consumer things that have become smart, it's going to be all about screen real estate and the size of the app store.
This is a subject that I thought about a lot in a previous career. I have a patent for a software architecture that allowed smart devices (cable television set-top boxes specifically) to live behind a stout firewall and extend limited functionality to the Internet in a controlled manner. Because even back then I was really afraid about the damage that could be done to the devices and that the devices could do if they were just attached transparently to the Internet. Even with this sort of protection, having lots of relatively simple-minded devices running in my house was a scary thought. Part of my job was finding ways to use little cracks in a firewall to implement gross security breaches. It's amazing what you can do if you can get the right one piece of software to run on something behind the firewall. Given enough devices behind my home's firewall, especially if some of those devices are portable and get attached to other networks occasionally, somebody is going to figure out a way to get that first piece of code in place.
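A minimal sketch of the general idea (not the patented architecture, and with hypothetical command names): a small gateway inside the firewall exposes only a short whitelist of operations, and everything else from the outside is refused, so the devices themselves are never directly reachable.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    ALLOWED = {"status", "pause_recording"}      # hypothetical device commands

    def forward_to_device(command):
        # Placeholder for the internal, trusted device protocol.
        return f"device acknowledged '{command}'"

    class Gateway(BaseHTTPRequestHandler):
        def do_GET(self):
            command = self.path.strip("/")
            if command not in ALLOWED:
                self.send_error(403, "command not exposed")   # refuse everything else
                return
            body = forward_to_device(command).encode()
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), Gateway).serve_forever()

The point of the pattern is that the whitelist, not the device, defines the attack surface; a compromised outside caller still cannot reach anything the gateway does not explicitly expose.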
On a lighter note, smarts are going to be embedded in things we wear as well. I eagerly await reports of the first celebrity wardrobe malfunction that gets blamed on "somebody hacked the clothing."
[1] The use of a picture of a Samsung smart refrigerator should not be taken to indicate that the hacked refrigerator was a Samsung product, or that Samsung refrigerators' security is either better or worse than that of any other smart appliance. It's just a convenient picture.
Wednesday, January 15, 2014
Western Secession 5 - The Great (Plains) Divide
In a previous post, I wrote about some of the ways that people have proposed partitioning the United States (in relatively large chunks; a future post will discuss why I'm not interested in proposals to carve off little pieces). This time, I'm going to lay some groundwork for a geography-based partition that is seldom considered. The mesh-based population cartogram shown in this post suggests the starting point.
The Great Plains region occupies portions of ten states. In the upper map to the left, the Great Plains counties in those states are shown in white, and the remaining portions of the states in various colors. I've intentionally left out any state boundaries within the white area in order to emphasize the point that I'm writing about a situation that is regional rather than state-based.
There have been a lot of different definitions of the Great Plains over the years [1], so it's worth saying where this one came from. I started with the US Census Bureau's publication Population Dynamics of the Great Plains: 1950 to 2007 [2]. Then I removed seven counties from the Front Range area of Colorado and four counties from the Austin area in Texas. I had three reasons for trimming out those 11 counties: (1) they sit on the periphery of the Plains and different standards might or might not include them; (2) they have grown enormously in population for reasons that have nothing to do with the Plains; and (3) that large population growth doesn't fit my narrative. Sometimes there are just outliers in the data that should be excluded.
The Great Plains as shown here is a large region: somewhat smaller than Alaska but almost twice the size of Texas; 50% larger than the Pacific Coast states of California, Oregon, and Washington combined; 20% larger than the 15 Atlantic Coast states combined. The Great Plains are also quite empty, at least so far as people go. The lower cartogram resizes each county based on its population. The Plains don't exactly disappear, but they become a narrow strip. The strip is less narrow at the north and south ends, where there are large fossil-fuel deposits that have been or are being developed. Many parts of the Plains are getting emptier as time goes on, with populations that are shrinking in absolute terms.
This is a long-term trend; the Census Bureau document mentioned above identifies a large number of counties whose population peaked more than 80 years ago. Nor is the population situation likely to reverse itself. Agriculture has become increasingly mechanized, requiring fewer people. The same is true for the energy resources that occur in some parts of the Plains: it doesn't take a lot of people to extract a million tons of coal from a Wyoming surface mine, or to maintain a large wind farm, or to drill the oil wells in the Bakken area of North Dakota. Generally speaking, the area lacks the kinds of infrastructure that would attract businesses that aren't concerned with natural resources. In many cases, the infrastructure that does exist -- in the sense of services like medical care or education -- is declining.
In a future where distance becomes more important than it is today, the wide, empty expanse of the Great Plains is a natural dividing line between eastern and western parts of the country. Always keep in mind the scale of things: the width of the Plains ranges from 250 to about 550 miles. Compared to the Boston-to-Washington, DC megalopolis, the Plains have eight times the area but only one-tenth the population. Even in a local comparison, the bulk of the Front Range population -- the large yellow bulge on the cartogram -- lives in a strip 30 miles or so wide on that portion of the Plains immediately adjacent to the Rocky Mountain foothills.
The next question to consider is "Are there important differences in the two parts of the country separated by the Great Plains?" In the next post in this series, I'll show a variety of such differences.
[1] There has always been some uncertainty about the dividing line between wetter, lower-altitude prairie and the drier, higher Great Plains. Some cartographers extend the Plains much farther to the east, including parts of Minnesota and Iowa. Some definitions also stop the Plains on the south end before they reach the Rio Grande, asserting that that area becomes so dry that it should be categorized as desert.
[2] Unlike some works, in this one the authors did not include a list of which counties they had chosen. That's a shame, given that there are FIPS (federal information processing standards) codes for every county and county-equivalent in the country, and lots of useful data indexed by FIPS code. I generated my list after a relatively miserable afternoon spent with Figure 6 from the publication and some other information sources.
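A footnote to the footnote: once the county list exists as FIPS codes, the bookkeeping becomes a few lines of code. The sketch below assumes hypothetical file and column names, and the two example FIPS codes are illustrative only; it simply filters a county population table against the list and totals the Plains population at the endpoints of the Census Bureau's study period:

# Sketch only: "great_plains_fips.txt" and "county_population.csv" are
# hypothetical files, with columns assumed to be fips,pop1950,pop2007.
import csv

# Counties identified as Great Plains (5-digit FIPS strings), minus the
# 11 Front Range and Austin-area counties trimmed out above.
with open("great_plains_fips.txt") as f:
    plains_fips = {line.strip() for line in f if line.strip()}
trimmed = {"08005", "48453"}   # illustrative examples, not the actual trimmed list
plains_fips -= trimmed

total_1950 = total_2007 = 0
with open("county_population.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["fips"] in plains_fips:
            total_1950 += int(row["pop1950"])
            total_2007 += int(row["pop2007"])

print(f"Great Plains population 1950: {total_1950:,}")
print(f"Great Plains population 2007: {total_2007:,}")
print(f"Change: {100 * (total_2007 - total_1950) / total_1950:+.1f}%")

That is the kind of analysis that a published FIPS list would have made trivial, and that an afternoon with Figure 6 does not.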