Monday, February 21, 2011

What does structural unemployment say about job openings and hires

Structural unemployment seems to imply that there are job openings that do not get filled. So why do the JOLTS data (above) consistently show hires running higher than openings?

It's a compositional effect; the chart below shows the same data for durable goods manufacturing, where openings are now running higher than hires:
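A toy numerical example may help: even when one sector posts more openings than it fills, a few high-turnover sectors can dominate the aggregate so that total hires exceed total openings. All numbers below are made up, purely for illustration.

```python
# Hypothetical JOLTS-style monthly figures (thousands) -- made-up numbers,
# purely to illustrate the compositional effect.
sectors = {
    # sector: (openings, hires)
    "durable goods mfg":   (250, 180),   # openings exceed hires here
    "leisure/hospitality": (400, 900),   # high-turnover sectors hire far more
    "retail":              (300, 620),   # than they have posted at any one time
}

total_openings = sum(o for o, h in sectors.values())
total_hires = sum(h for o, h in sectors.values())
print(total_openings, total_hires)  # -> 950 1700: aggregate hires > openings
```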

Over-qualified for a job and structural unemployment theories

Every recession, stories appear in the media about how openings go unfilled even though applicants are more than qualified for the job. Here's one:

Vineland resident Joseph Sangataldo has applied for several jobs, not counting civil service positions, since his layoff last October. ... The 53-year-old college graduate has spent much of his adult life in the public sector in employment, public health and social services.

Naked Capitalism lays out some reasons why this might be the case (from an employer's perspective):

... as cost cutting and short term earnings fixation became more pervasive, average time of employment shortened greatly. And with that came a major shift in behavior: it made less and less sense for employers to hire talented people with good general competence and character and train them. They’d be unlikely to recoup the cost of the investment ...

Which seems to imply:
1. Even though the cost of training highly educated workers may be lower, employers don't expect the worker to stay once the economy turns around.
2. This in turn implies that they don't expect the recession to last long enough for them to break even on the cost of hiring and training.

But this recession seems different as pointed out in the NYT:

Don Carroll, a former financial analyst with a master’s degree in business administration from a top university, was clearly overqualified for the job running the claims department for Cartwright International, a small, family-owned moving company here south of Kansas City. ... Conventional wisdom warns against hiring overqualified candidates like Mr. Carroll, who often find themselves chafing at their new roles. ... A result is a new cadre of underemployed workers dotting American companies, occupying slots several rungs below where they are accustomed to working. These are not the more drastic examples of former professionals toiling away at “survival jobs” at Home Depot or Starbucks. They are the former chief financial officer working as comptroller, the onetime marketing director who is back to being an analyst, the former manager who is once again an “individual contributor.”

But some of the difficulties are adjustment of expectations on the part of the overqualified employee:

Mr. Carroll’s cubicle mate, Mindy William, a former graphic designer and single mother who had been working at Target before she was recently hired as a claims adjuster, said she had noticed that he seemed to talk about his old job a lot.

“I know it’s been an adjustment for him,” she said. “He’s just making the best of it like the rest of us are. We’re glad to have jobs in this recession.”

For his part, Mr. Carroll admitted that he had caught himself often trying to drop his credentials into conversations at his new workplace.

“Obviously that stems from maybe some embarrassment at the level that I’m at,” he said. “I do want people to know that, to some extent, this isn’t who I am.”

Is over-qualification or under-employment consistent with structural unemployment (that is, the version of the structural argument that equates skill with education level)? It depends on the rate at which over-qualified workers find jobs. The more of these under-employed workers there are, the weaker the structural unemployment argument becomes. It also means that workers are flexible enough (in the classical economics sense) to accept lower wages at a different but perhaps related job. It may even argue for higher industrial and occupational mobility during recessions.

Sunday, February 20, 2011

Do we know what we don't know

A few items over the past few months give pause about what we think we know or don't know.

1. Fluoride, we know, has beneficial effects - but how much fluoride is best? This news item reverses what we thought we knew:

The U.S. Department of Health and Human Services announced plans Friday to lower the recommended level of fluoride in drinking water for the first time in nearly 50 years, based on a fresh review of the science.

The announcement is likely to renew the battle over fluoridation, even though the addition of fluoride to drinking water is considered one of the greatest public health successes of the 20th century. The U.S. prevalence of decay in at least one tooth among teens has declined from about 90 percent to 60 percent.

The government first began urging municipal water systems to add fluoride in the early 1950s. Since then, it has been put in toothpaste and mouthwash. It is also in a lot of bottled water and in soda. Some kids even take fluoride supplements. Now, young children may be getting too much. ...

One reason behind the change: About 2 out of 5 adolescents have tooth streaking or spottiness because of too much fluoride, a government study found recently. In extreme cases, teeth can be pitted by the mineral — though many cases are so mild only dentists notice it. The problem is generally considered cosmetic and not a reason for serious concern.

The splotchy tooth condition, fluorosis, is unexpectedly common in youngsters ages 12 through 15 and appears to have grown more common since the 1980s, according to the Centers for Disease Control and Prevention.

But there are also growing worries about more serious dangers from fluoride.

The Environmental Protection Agency released two new reviews of research on fluoride Friday. One of the studies found that prolonged, high intake of fluoride can increase the risk of brittle bones, fractures and crippling bone abnormalities.

Critics of fluoridated water seized on the proposed change Friday to renew their attacks on it — a battle that dates back to at least the Cold War 1950s, when it was denounced by some as a step toward Communism. Many activists nowadays don't think fluoride is essential, and they praised the government's new steps.

See also this SciAm article (subscribers only).

2. This SciAm (subscribers only) article on vitamin D was so convincing that not long after reading it, I went out and got myself some vitamin D supplements. It is unusually convincing (to me), and it is hard to pin down why - especially since no randomized trials are involved. Theoretical links between vitamin D and health were explored and argued to hold. There were no effect sizes and, least of all, very little on recommended dosage. Perhaps it was because both authors were so convinced that they are now on vitamin D supplements as well (each of them on a different dose!).

Not long after reading that, Freakonomics reports on two apparently conflicting headlines on vitamin D by the NYT and WSJ. However, the headlines in these cases actually tell the same story:

A long-awaited report from the Institute of Medicine to be released Tuesday triples the recommended amount of vitamin D most Americans should take every day to 600 international units from 200 IUs set in 1997.

Given that the so-called 'hard' sciences are struggling with the dosage issue, it was refreshing to see economists able to say that they know what they don't know. Here is Mankiw on extending unemployment benefits:

So when I hear economists advocate the extension of UI to 99 weeks, I am tempted to ask, would you also favor a further extension to 199 weeks, or 299 weeks, or 1099 weeks? If 99 weeks is better than 26 weeks, but 199 is too much, how do you know?

It is plausible to me that UI benefits should last longer when the economy is weak. The need for increased aggregate demand is greater, and the impact on job search may be weaker. But this conclusion is hardly enough to tell us whether 99 weeks is too much, too little, or about right. It is also conceivable that the amount of UI offered in normal times is higher than optimal and that a further extension would move us farther from what is desirable.

Likewise, given what economists know, can we say whether a policy that targets inflation at 3 percent is better than one that targets inflation at 4 percent?

Saturday, February 19, 2011

Firefox not warning when closing multiple tabs

I had this problem when I upgraded to Firefox 3.6, and none of the settings changes suggested in various discussions worked. After fiddling around, I accidentally discovered that with my Privacy setting at "Never Remember History", Firefox would always close multiple tabs without warning, regardless of any other tab settings. I got around it by changing my Privacy settings to clear history on close instead.

Finally, one aggravation solved.


When I think of standards, I don't usually think of this. I can't make up my mind about this - is it a misallocation of resources to set standards on fruits and vegetables? Is it necessary? I don't know what to think.

Structural unemployment - skills revisited again

Anecdotal stories such as this one are important but at the same time can mislead. The report:

Evidence of a skills mismatch became increasingly clear in Fresno after the housing bubble burst, causing joblessness to nearly triple. ... Unemployment hovers at 16.9 percent, but managers at the 7,000-employee Community Medical Centers say they cannot find enough qualified technicians, therapists, or even custodians willing and able to work with medical waste.

The situation is much the same at Jain Irrigation, which cannot find all the workers it wants for $15-an-hour jobs running expensive machinery that spins out precision irrigation tubing at 600 feet a minute, 24 hours a day, seven days a week.

"The job requires at least a high school education, and maybe some technical training, but we don't seem to be getting the right people applying," said Aric J. Olson, Jain's president.

Perhaps it was the bubble that caused workers to reallocate their skills toward flipping properties and hustling the unsuspecting public into applying for NINJA loans, instead of accumulating skills? A comparison with how difficult it was to fill positions before the recession would be the minimum needed to allow even a very tentative conclusion of skill mismatch.

Yet the title of the article states the problem most succinctly: Why does Fresno have thousands of job openings - and high unemployment? And the answer seems to lie in the above quote - at least a high school education and some technical training. Perhaps the trend in HS graduation rates that was evident even before the recession is part of the answer. See Heckman and LaFontaine (2010) for evidence that HS graduation rates have been steadily declining, and this NCES report for evidence of the opposite.

As far as Fresno is concerned, the facts on the ground seem to indicate a lack of HS graduates and as argued previously, the lack of education is not the same as the lack of skills.

Stagnation in board games, computers, and toys

I've been making numerous trips to the toy store in search of gifts for the kids' friends - birthday parties and such - and I have been depressed by the lack of interesting board games and toys.

In terms of board games, it is hard to find anything that is both new and successful since I was a kid. I still see Risk, Monopoly, and the Game of Life, and the 'newer' stuff seems to be variations on the older games - Horseopoly, Dogopoly, Catopoly. Card games are all variations of some kind of gin rummy, and other games are offshoots of dominoes or dice-throwing. In some ways this attests to the fact that these games are long-lived and hence a sign of success. In other ways it shows a lack of imagination and innovation.

When I was a kid, a game we enjoyed was Mastermind, which I did get, but I was extremely disappointed in the quality. The pegs didn't really fit and everything felt cheap and plasticky - even though it was a fun game once, the cheapness of how it was made really took away all the pleasure.

And the poor quality isn't only reflected in toys. I recently bought an HP computer and was extremely disappointed by its cheap plastic casing and rickety-feeling keyboard. Even though it has a fairly fast dual processor, it was much louder than I expected. The last computer I bought, almost 6 years ago, was a Dell of just as poor quality, and for that reason I avoided Dell altogether. I'll have to add HP to the list. While it is true that computers have seen innovations, their design, appeal, and look-and-feel have stagnated and may even have taken a dive.

In many ways, I wonder if Tyler Cowen has understated the stagnation in our lives.

Comparing recessions: Job Openings

As an attempt at continuing on this series, this post compares job openings. Previous installment on job losses and layoffs is here.

In terms of levels, the chart shows that job openings in this recession do indeed look very different from the previous recession. The sectors that appear to be recovering are manufacturing, and finance and insurance. The information and mining/logging sectors were not particularly different from the previous recession, but all the other sectors appear to have experienced a shift toward lower levels of job openings.

The two different recessions are broken out below:

Openings this recession trended lower over the period, most likely because of its longer duration compared to the 2001 recession. Job openings over time were similar if we only make the comparison over the first 4 months of each recession.

Using rates instead of levels seems to tell a different story; in particular, this recession does not appear much more severe than the previous one in terms of the decrease in job opening rates.
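One reason levels and rates can diverge: JOLTS defines the job openings rate as openings divided by employment plus openings, so when the employment base shrinks in a deep recession, a larger level decline can translate into a rate decline of comparable size. A small sketch with entirely hypothetical numbers:

```python
# JOLTS job openings rate: openings / (employment + openings) * 100.
# All figures (thousands) are hypothetical, purely to illustrate the point.
def openings_rate(openings, employment):
    return 100.0 * openings / (employment + openings)

# Recession A: smaller level decline, stable employment base
rate_a_before = openings_rate(3_600, 132_000)
rate_a_after  = openings_rate(2_400, 132_000)

# Recession B: 50% larger level decline, but the employment base shrank too,
# so the decline in the *rate* is much closer to Recession A's
rate_b_before = openings_rate(4_800, 138_000)
rate_b_after  = openings_rate(3_000, 126_000)

print(rate_a_before - rate_a_after)  # roughly 0.87 percentage points
print(rate_b_before - rate_b_after)  # roughly 1.04 percentage points
```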

While there is no doubt that job opening rates were lower in this recession than in the previous one (see next two charts), the above indicates that except for leisure and hospitality, and education and health services (a sector one would have thought to be somewhat indifferent to the recession), job opening rates are almost back to their pre-recession levels.

Friday, February 18, 2011

Structural unemployment III - Skills

Another argument proposed by advocates of structural unemployment - the so-called zero marginal product workers argument - runs roughly as follows:

Workers were laid off during the recession, but over time output and profits have rebounded with no gain in employment (or decrease in unemployment). It is a structural argument because the workers hired during the boom times were not really needed and are generally low-skilled. Now that times are lean, firms have learned to do without them, and these workers are thus unlikely to be re-hired.

This argument has some merit if the measure of skill is education. But it isn't - skill is not the same as education. It is possible to broaden the definition of structural to include education, and if so, structural unemployment advocates have a point in believing that AD-based policies will be less effective, since the prescription calls for a more educated work force, which takes time.

However, it remains to be seen if the unemployment rates of the less educated in this recession are indeed higher than in the previous recession and whether they are indeed less likely to be hired during the recovery phase.

Tuesday, February 15, 2011

Structural unemployment - Skills revisited

In a previous post I deferred to ONET's usage/definition of skill when throwing out an example. But when economists think of skills, the items listed in ONET are not what I would think of as skills. For instance, knowledge of Fortran or C++ would be considered a skill, and a skill mismatch would arise if there were a lot of Fortran programmers looking for work while employers were demanding C++ programmers.

What would be considered skills for a programmer in this case is listed in ONET as tasks:
  • Write, update, and maintain computer programs or software packages to handle specific jobs such as tracking inventory, storing or retrieving data, or controlling other equipment.
  • Write, analyze, review, and rewrite programs, using workflow chart and diagram, and applying knowledge of computer capabilities, subject matter, and symbolic logic.
Or Tools/Technology:
  • Data base management system software — Microsoft SQL Server; MySQL software; Oracle procedural language/structured query language PL/SQL; Pick software
  • Data base user interface and query software — dBase Plus; IEA Software Emerald; Microsoft Access; Structured query language SQL
  • Development environment software — C; Microsoft Visual Basic; Tier generator software; Xerces2 Java Parser
  • Object or component oriented development software — C++; Greatis Object Inspector; PowerSoft PowerBuilder; Sun Microsystems Java
  • Web platform development software — Hypertext markup language HTML; JavaScript; Microsoft Visual C#; Progress WebSpeed Workshop
In the skills category, the following is listed (partial list only):
  • Programming — Writing computer programs for various purposes.
  • Reading Comprehension — Understanding written sentences and paragraphs in work related documents.
  • Complex Problem Solving — Identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
There appear to be at least two ways of thinking about skills - the broad set ONET lists above, or in terms of Tools and Technology. The broader definition covers something more transferable, while the second does not. When economists speak of a skill mismatch, the second, narrow definition is the one we are using.

Another way to think of skill mismatch is as a change in job definition or tasks. Employers may expand the role of a programmer to also include other tasks, such as those of database administrators. Whatever the reason this occurs, it can also be considered a skill mismatch because programmers are not necessarily trained as database administrators. (I could throw out a ridiculous example of employers wanting laborers who also happen to be able to play the harmonica.)

But whichever definition of skills is used, the argument that the current high unemployment rate is a result of a skill mismatch is not based on any evidence that a shift in the demand for skills has taken place due to technology adoption (unlike the computerization era of the 1980s) or some other as-yet-unidentified shift in demand.

Sunday, February 13, 2011

Convert pre-2002 CPS industry and occupation codes

The idea is straightforward. Most of the pre-2002 occupations have been redistributed into other occupations post-2002. To get to the new codes, generate a random number for each observation in the data set. If an observation's random number falls within one of the ranges defined by Census for its old occupation, we assign the corresponding new code.

The rules for the proportions of old occupations and industries to be redistributed to the new codes are available at the Census web site or at IPUMS. The spreadsheets containing the rules for industry were downloaded from here while the rules for the occupations were downloaded from here.

The caveats are that the proportions are assumed to remain constant over time and that the data set is large enough that how the random numbers are generated does not matter 'much'.
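The same logic can be sketched in Python (the post itself uses SAS). The crosswalk percentages below are made up for illustration only; the real proportions come from the Census bridging spreadsheets.

```python
import random

# Hypothetical crosswalk: old occupation code -> list of (new_code, percent),
# with the percents for each old code summing to 100. Illustration only;
# the real proportions come from the Census bridging spreadsheets.
crosswalk = {
    4: [(1, 77.143), (43, 22.857)],
    5: [(1, 1.571), (2, 3.141), (43, 95.288)],
}

def recode(old_code, rng=random):
    """Draw a uniform number in (0, 100] and walk the cumulative
    percent ranges for old_code, mirroring the generated SAS if-blocks."""
    y = rng.uniform(0, 100)
    cum = 0.0
    for new_code, pct in crosswalk[old_code]:
        cum += pct
        if y <= cum:
            return new_code
    return crosswalk[old_code][-1][0]  # guard against rounding past 100

random.seed(6875309)
draws = [recode(4) for _ in range(10_000)]
print(draws.count(1) / len(draws))  # should land close to 0.77143
```

With a large enough sample, the share of observations recoded to each new code converges to the Census proportions, which is the property the whole procedure relies on.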

The SAS code first converts the occupation spreadsheet to SAS data to be written out to a text file:
proc import datafile= "I:\occ_90-00.xls" out= temp dbms= xls replace;
run;

data temp1(drop = Table_2___1990_Census_Occupation f6);
   set temp(drop = f5 f7 f8);
   rename /* Table_2___1990_Census_Occupation = oldcode*/
      f2 = oldlabel f3 = newcode f4 = newlabel ;
   if _N_ < 5 then delete;
   if index (Table_2___1990_Census_Occupation, 'categor')>0 then delete;
   oldcode = input (Table_2___1990_Census_Occupation,4.);
   if f6 ne 'NA' then percent = input(f6, 8.);
run;

data temp1(drop = oldcode oldlabel newcode);
   retain c clabel;
   length clabel $ 65;
   set temp1;
   if oldcode ne . then c = oldcode;
   if oldlabel ne ' ' then clabel = oldlabel;
   ncode = input(newcode, 4.);
   if percent = . then delete;
run;

data temp2;
   retain c ncode;
   set temp1(where = (ncode ne .)); by c;
   retain cumpct;
   if first.c then do;
      cumpct = percent;
      lpercent = 0;
   end;
   else do;
      lpercent = cumpct;
      cumpct = cumpct + percent;
   end;
run;

data _null_;
   set temp2; by c;
   file "I:\CPS\SAS Programs\";
   if first.c then do;
      if _N_ = 1 then put "if old_code = " c " then do;";
      else put "else if old_code = " c " then do;";
      put " if " lpercent "< y <= " cumpct " then new_code = " ncode ";";
   end;
   else do;
      put " if " lpercent "< y <= " cumpct " then new_code = " ncode ";";
   end;
   if last.c then put "end;";
run;

The following does the same for industry:
proc import datafile= "I:\ind_90-00.xls" out= temp dbms= xls replace;
run;

data temp1;
   set temp(drop = f5 f7 f8);
   rename Table_1___1990_Census_Industry_C = oldcode
      f2 = oldlabel f3 = newcode f4 = newlabel f6 = percent;
   if _N_ < 5 then delete;
   if index (Table_1___1990_Census_Industry_C, 'categor')>0 then delete;
run;

data temp1(drop = oldcode oldlabel newcode);
   retain c clabel;
   length clabel $ 65;
   set temp1;
   if oldcode ne ' ' then c = input(oldcode, 4.);
   if oldlabel ne ' ' then clabel = oldlabel;
   ncode = input(newcode, 4.);
   if percent = . then delete;
run;

data temp2;
   retain c ncode;
   set temp1; by c;
   retain cumpct;
   if first.c then do;
      cumpct = percent;
      lpercent = 0;
   end;
   else do;
      lpercent = cumpct;
      cumpct = cumpct + percent;
   end;
run;

data _null_;
   set temp2; by c;
   file "I:\CPS\SAS Programs\";
   if first.c then do;
      if _N_ = 1 then put "if old_code = " c " then do;";
      else put "else if old_code = " c " then do;";
      put " if " lpercent "< x <= " cumpct " then new_code = " ncode ";";
   end;
   else do;
      put " if " lpercent "< x <= " cumpct " then new_code = " ncode ";";
   end;
   if last.c then put "end;";
run;

Now there are two SAS programs to be included. First, a search and replace is used to change "old_code" to PEIO1OCD (occupation) and PEIO1ICD (industry), and "new_code" to NEW_OCC and NEW_IND respectively (for example). It would have been simpler to have SAS write these out directly, but I wanted to be able to reuse the programs if some other data set used different variable names, and I needed to remember which was old and which was new. The following is a snippet to do the actual recode:

call streaminit(6875309&yr);
x = rand('uniform') * 100;
y = rand('uniform') * 100;

%include "";
%include "";

Here's a snippet of what the actual code will look like (for occupation):
if peio1ocd = 3 then do;
   if 0 < y <= 100 then peio1ocd_r = 3 ;
end;
else if peio1ocd = 4 then do;
   if 0 < y <= 77.143 then peio1ocd_r = 1 ;
   if 77.143 < y <= 100 then peio1ocd_r = 43 ;
end;
else if peio1ocd = 5 then do;
   if 0 < y <= 1.571 then peio1ocd_r = 1 ;
   if 1.571 < y <= 4.712 then peio1ocd_r = 2 ;
   if 4.712 < y <= 6.806 then peio1ocd_r = 10 ;
   if 6.806 < y <= 8.377 then peio1ocd_r = 11 ;
   if 8.377 < y <= 13.613 then peio1ocd_r = 12 ;
   if 13.613 < y <= 14.66 then peio1ocd_r = 13 ;
   if 14.66 < y <= 16.754 then peio1ocd_r = 15 ;
   if 16.754 < y <= 17.278 then peio1ocd_r = 16 ;
   if 17.278 < y <= 17.802 then peio1ocd_r = 22 ;
   if 17.802 < y <= 19.896 then peio1ocd_r = 23 ;
   if 19.896 < y <= 20.42 then peio1ocd_r = 30 ;
   if 20.42 < y <= 20.944 then peio1ocd_r = 36 ;
   if 20.944 < y <= 21.991 then peio1ocd_r = 41 ;
   if 21.991 < y <= 26.703 then peio1ocd_r = 42 ;
   if 26.703 < y <= 67.541 then peio1ocd_r = 43 ;
   if 67.541 < y <= 71.73 then peio1ocd_r = 54 ;
   if 71.73 < y <= 73.824 then peio1ocd_r = 62 ;
   if 73.824 < y <= 76.965 then peio1ocd_r = 81 ;
   if 76.965 < y <= 77.489 then peio1ocd_r = 84 ;
   if 77.489 < y <= 82.725 then peio1ocd_r = 93 ;
   if 82.725 < y <= 88.484 then peio1ocd_r = 122 ;
   if 88.484 < y <= 89.008 then peio1ocd_r = 164 ;
   if 89.008 < y <= 91.102 then peio1ocd_r = 202 ;
   if 91.102 < y <= 93.72 then peio1ocd_r = 211 ;
   if 93.72 < y <= 94.244 then peio1ocd_r = 215 ;
   if 94.244 < y <= 94.768 then peio1ocd_r = 354 ;
   if 94.768 < y <= 95.292 then peio1ocd_r = 382 ;
   if 95.292 < y <= 95.816 then peio1ocd_r = 395 ;
   if 95.816 < y <= 100.005 then peio1ocd_r = 525 ;
end;

Structural unemployment II - Skills

Another definition of structural unemployment is based on skills. In this definition, unemployment is high because employers cannot find workers with the skills that they want, while job seekers do not have the skills to do the jobs on offer even though they want to work. Again, it is difficult to say how relevant this argument is in the context of the current recession. If anything, the mismatch in skills is an underlying trend that predates the recession, and the causality runs from skill mismatch to high unemployment rather than from the recession to a skill mismatch.

The paucity of data makes it difficult to evaluate whether the current state of the economy is one of skill mismatch. There is little data on the skills demanded by employers. One source that has yet to be collected and mined thoroughly is job ads, which speak directly to the skills demanded. Another possible source is ONET, but that website lists the skills that BLS thinks employers are looking for in an occupation.

For instance, ONET lists the following skills for an accountant:
  • Active Listening — Giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times.
  • Mathematics — Using mathematics to solve problems.
  • Reading Comprehension — Understanding written sentences and paragraphs in work related documents.
  • Writing — Communicating effectively in writing as appropriate for the needs of the audience.
  • Critical Thinking — Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions or approaches to problems.
  • Speaking — Talking to others to convey information effectively.
  • Judgment and Decision Making — Considering the relative costs and benefits of potential actions to choose the most appropriate one.
  • Complex Problem Solving — Identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.
  • Time Management — Managing one's own time and the time of others.
  • Active Learning — Understanding the implications of new information for both current and future problem-solving and decision-making.
Unfortunately ONET is not a reflection of the demand for skills in the economy. One ambitious study that evaluated the skill composition of the labor force is by Autor, Levy and Murnane (2003). The authors use the predecessor to ONET, the Dictionary of Occupational Titles, to link skills to tasks to workers to industries to establish the effect of computerization on skills.

This study was an intuitive response to the effects of computerization, which were already being felt in the 1980s. In the current recession, it would be hard to say that any such trend is affecting skill demand. If anything, a more ambitious study that actually tries to aggregate skill demand and supply would be needed to evaluate the mismatch argument.

Similar to the argument that job losses in manufacturing and construction are causing structural unemployment, the skill-based argument lacks any data whatsoever. Any argument based on skill mismatch must therefore collapse under the weight of available data (or lack thereof).

Tuesday, February 8, 2011

Structural Unemployment

It has been argued that the high unemployment rate in the current recession is mostly structural and because of this AD/fiscal stimulus policies will likely have small effects. Moreover, it has also been said that many of the jobs lost will never return.

For instance, in The Economist:

... America’s labour market has developed structural problems that may explain why it is struggling to respond. ... One is that the recession hit certain industries, such as manufacturing and construction, especially hard. Jobs lost there will return slowly if at all, and people turned out of factories and building sites may be poorly suited to openings in growing fields such as health care and education.

It is hard to evaluate what these statements really mean without a definition of structural or what it means that the jobs will never return. For the rest of the post, structural unemployment will be taken to imply that the jobs lost in construction and durable goods manufacturing will not be coming back.

The first thought is the following: Why is losing jobs in construction and manufacturing considered structural whereas the loss of jobs during the dot-com bust was not? One obvious reason is that the information and computer services sector was (and perhaps still is) a growing industry. However, durable goods manufacturing is considered a declining industry. Did the financial crisis merely accelerate its decline? How can one evaluate the statement that most jobs lost will not be coming back?

The chart shows total employment in the construction (blue) and durable goods manufacturing industries. There is an unmistakable upward trend in construction and a similarly unmistakable downward trend in durable goods manufacturing. It is fair to say that some of the jobs lost will not be coming back. But it is also fair to say that following every recession, some of the jobs lost did come back. One definition of structural might take the ratio of an industry's job losses during a recession to its job gains 6 months to a year afterward, and use the implied shortfall in employment relative to a no-recession baseline to defend the view that a recession has caused structural unemployment.
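As a sketch of what such a measure might look like, with purely hypothetical employment numbers for a single industry:

```python
# Hypothetical industry employment levels (thousands): pre-recession peak,
# trough, and 12 months into the recovery. Numbers are made up.
peak, trough, recovery = 8_000, 6_500, 7_100

jobs_lost = peak - trough          # jobs shed during the recession
jobs_regained = recovery - trough  # jobs that came back within a year
recovery_ratio = jobs_regained / jobs_lost

# The residual shortfall a year in -- one rough candidate for the
# "structural" component of the industry's job losses
shortfall = peak - recovery
print(recovery_ratio, shortfall)  # -> 0.4 900
```

The lower the recovery ratio relative to past recessions, the stronger the case that the remaining shortfall is structural rather than cyclical.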

However, it appears that navel gazing and polemics seem to be the norm in the blogosphere as far as this issue is concerned.

Sunday, February 6, 2011

Lack of innovation in household appliances

Paul Krugman's piece on the lack of innovation in the kitchen should be extended to household appliances. As someone who has just spent the past 40 minutes cleaning out the roller brush of a $400 Miele vacuum cleaner, I can safely attest that there has been a definite lack of innovation.

I mean cleaning a vacuum cleaner. Really!

Direct Costs of recruiting

This article surprised me:'s basic rate is $395 per job posting, though it offers volume discounts. Companies also pay to search the resumes that applicants have posted. (Jobseekers can access the sites for free.) Considering that some Fortune 500 companies hire thousands of workers a year, even in tough times, the cost of listing all their open jobs can approach $1 million.

The article talks about the Direct Employers Association:

Bill Warren founded an early online job board in the 1990s, helped kick-start an industry and was president of, one of the leading Internet career sites. But these days he's not very happy with the results.

So he's taking another crack at it, going after Monster, Career Builder and similar commercial job sites. Warren is starting a nonprofit job listing system that could lower the costs that employers pay to list positions and make the process easier and more fruitful for applicants.

... Warren, 68, says that those commercial sites charge employers so much to list openings that the companies don't post all their jobs – leaving potential applicants unaware of opportunities. Warren also believes that the sites push too much advertising on jobseekers and include too many "work at home" scam jobs.

... DirectEmployers' software will automatically code such listings to make them easily searchable by city or occupation. The association also will sort the listings in as many as 30,000 regional ".job" Web addresses it hopes to begin rolling out in March, such as "" That will help people search for jobs in specific places. The group hopes to add thousands of occupational domain names, such as "," later this year.

Companies that belong to the association pay a $15,000 annual membership fee and will receive prominent placement on the ".jobs" Web sites. Smaller companies can purchase a ".jobs" domain name for about $125 a year and then post jobs for free. They can also work through their state employment agencies, which post jobs online at no charge.

At those prices, the new ".jobs" system could be another online innovation that undercuts what currently exists – much as the invention of job boards themselves undermined newspaper help-wanted ads.

In an age of Craigslist and listing on your own website, I would have expected the rates to be less than $100 per posting.

Prediction failure

Raghu Rajan's take on economists' failure to predict the crisis:

I would argue that three factors largely explain our collective failure: specialization, the difficulty of forecasting, and the disengagement of much of the profession from the real world.

... Because the profession rewards only careful, well-supported, but necessarily narrow analysis, few economists try to span sub-fields.

Even if they did, they would shy away from forecasting. The main advantage that academic economists have over professional forecasters may be their greater awareness of established relationships between factors. What is hardest to forecast, though, are turning points – when the old relationships break down. While there may be some factors that signal turning points – a run-up in short-term leverage and asset prices, for example, often presages a bust – they are not infallible predictors of trouble to come.

The meager professional rewards for breadth, coupled with the inaccuracy and reputational risk associated with forecasting, lead to disengagement for most academics. And it may well be that academic economists have little to say about short-term economic movements, so that forecasting, with all its errors, is best left to professional forecasters.

My thoughts were here. However, I had some other thoughts as well, which others have already pointed out:

If I predict something bad will happen and if I take steps to prevent it from happening does this mean that my prediction was wrong? For instance, a fortune teller informs me that I will be in an auto crash today and as a consequence of the warning, I avoid going out of the house. Does this mean that the fortune teller was wrong?

If economists have predicted nine of the last five recessions, does this mean that due to feedback effects, homo economicus has taken steps to avoid four of them? If an engineer predicts that 30 percent of bridges in this country will collapse in the next 5 years and the government takes steps to prevent it, does this mean that the engineer was wrong?

Two books on dot-com businesses

One was Blown to Bits by Philip Evans and Thomas Wurster, an overblown, gimmicky book about how to take advantage of the dot-com boom back in 2000. (I'm a little late to the party.) It's filled with hyperbole and the one-liners that management consultants use to peddle their trade to unsuspecting CEOs. How do you think we should take advantage of the Internet? Their response: First, you have to take your current business and blow it up, dude. Then you have to think like an insurgent and blow your competition up. (Hence the title of their book.) Any question is answered by: Blow it up.

The second, more useful book is by Patricia Seybold, with detailed case studies of what works and what doesn't. The book is very informative, but unfortunately what she described as cutting edge then is average now, and web sites and companies that have achieved the level of penetration and implementation she describes have nowhere to go but down.

Some of my recent experiences: A tree fell on our phone line and I tried to get Verizon to come out to hang it back up. The web offered no help: after drilling through all the pages on Tech Support, I couldn't find a phone number to call except the main number (more on this later). So I submitted a form asking for advice, which came back with a response saying, in effect, we cannot help you, please call the main number. So I did, and actually scheduling a service call was next to impossible with the obtuse menus, which only gave me my account balance. Seriously! After calling the number five times and going through different paths of the system, I finally stumbled on an actual live person. It really couldn't get any harder.

Trying to use miles on a partner airline is impossible on the web – yes, you have to call someone! In this day and age, can you imagine! And while scheduling flights, hotel reservations, etc. by web is a no-brainer, the technologies do not scale down. Yes, I would love to be able to do this for dentists, doctors, etc. Imagine having to talk to someone for something as simple as this!

What is also interesting to note is that while companies like Dell have been successful selling over the web, they have also begun reselling via big-box stores again, while companies like Gateway, which used to be successful with a web presence, have been 'blown to bits' by Dell and HP and now sell exclusively in stores. In some ways this reflects the fact that buying computers is no longer as complicated as it used to be and requires less customization.

Saturday, February 5, 2011

Two books on crisis events that I never even heard about

The first is The Greatest Trade Ever by Gregory Zuckerman. This is essentially the story of how a group of 'mavericks' who believed the country was gripped by a housing bubble frenzy were determined to short the market, and how they figured out a way to do it. The title of the book refers to the short selling done by John Paulson, a hedge fund manager with very little knowledge of the mortgage market, although it is also a story of market incompleteness – mortgages couldn't be shorted directly, nor could their securitized versions. Neither could credit default swaps, until the contracts became more or less standardized (at least that was my reading of it) and the emergence of the ABX index eventually made it easier to short the market. Those who were able to profit from the subprime crisis believe, of course, that had they been able to short the market directly, the bubble would not have been quite as frothy. This is also the story of how Deutsche Bank, led by Greg Lippmann, shorted the market even as his counterparts were going long. There are also other stories of traders trying to stay liquid as the markets moved against their short positions. All in all a good read.

The second is The Quants by Scott Patterson. It details the meltdown the quants suffered in August 2007 (something that I wasn't even aware of) as well as the history of quantitative hedge funds – mainly Ed Thorp's role in developing quantitative methods to exploit mispricing in the convertible bond market. I view the role of the quants as peripheral to the crisis, although the book explains how stock prices can rise in the midst of a crisis, as the Dow was doing: the quants have to buy massive amounts of stock to return to the prime brokerages the shares they borrowed as they unwind their positions, if I understand this correctly. The book also offers small glimpses into their world, and it would seem that all of the characters have a penchant for gambling – card counting in Vegas seems to be an induction rite while in college, with a graduation to high-stakes poker after they have made their billions. The book started off really badly (for me) though:

Peter Mueller stepped into the posh Versailles Room of the century- old St. Regis Hotel in midtown Manhattan and took in the glittering scene in a glance. It wasn’t the trio of cut-glass chandeliers hung from a gilt-laden ceiling that caught his attention, nor the pair of antique floor-to-ceiling mirrors to his left, nor the guests’ svelte Armani suits and gem-studded dresses. Something else in the air made him smile: the smell of money. And the sweet perfume of something he loved even more: pure unbridled testosterone fueled competition.

Fortunately, the prose got better (or at least not worse) later.

These two events caught me somewhat by surprise, especially since I thought I was keeping tabs on the crisis. I had only heard of John Paulson's trade long after the fact, when he had to testify.

Pepco yet again

The recent snowstorm dubbed Commutageddon brought down power lines and left almost 400,000 people in the metro area without power. Pepco's service was again thrust into the limelight, and previously I had said that power service here was no better than in the Third World. In this report from WaPo, an irate customer said:

"Show me one capital city of an industrialized country where people have to move to hotels once or twice a year because their local utility provider cannot keep the power on."

Certainly Pepco's performance has been less than stellar, as another WaPo report finds:

... the average Pepco customer experienced 70 percent more outages than customers of other big city utilities that took part in one 2009 survey. And the lights stayed out more than twice as long.

Pepco's reliability began declining five years ago, records show; company officials acknowledge that they have known of the problem but that they only started to focus on it more recently.

Moreover, Pepco has long blamed trees as a primary culprit for the frequency and duration of its outages, implying that the problem is beyond its control. But that explanation does not hold up under scrutiny, The Post analysis found. By far, Pepco equipment failures, not trees, caused the most sustained power interruptions last year.

... But Pepco's reliability problems are more pervasive. Some of Pepco's most disturbing failures come quietly on days with no violent weather, according to The Post's analysis of industry data, interviews with experts and a review of thousands of pages of documents.

In recent years, Pepco has placed near the bottom for daily reliability in surveys that compared power companies around the country. Pepco tends to have more sustained power interruptions, defined as those lasting longer than five minutes. And when the lights go dark, they tend to stay off longer. In one 2008 survey, Pepco finished last among participating utility companies on two of three reliability measurements, records filed with regulators show. Pepco stopped participating in that annual study after its last-place finish.

Should we blame it on free markets and deregulation that emphasizes profits over investment in equipment?

... Pepco and its investors have enjoyed attractive earnings and share prices that have nearly doubled since 2009.

... Pepco's internal records show that in 2009 the company's workers identified equipment failures as the most common cause of outages, accounting for 44 percent. That was a 24-point increase over the previous year.

And the report goes on to compare:

The Washington region's three primary power providers have similar percentages of aboveground and underground lines. Pepco has buried 56 percent of its lines, BGE has buried 62 percent and Dominion Virginia has 58 percent of its lines in Northern Virginia underground (versus 38 percent systemwide).

Most memorable among the storms was this year's Snowmageddon, which brought more than two feet of snow to Montgomery County and shut down the region for days. Pepco lost power to almost 98,000 customers at the storm's height, in an outage that began at 7 p.m. Feb. 5, and did not fully restore service for about a week, federal data show.

Dominion, which serves about three times as many customers in Virginia and parts of North Carolina, lost power to 105,000 customers in an outage that began seven hours later and was declared resolved after about 29 hours.

In other words, Dominion's service stayed active longer and was fully restored more quickly - even though roughly the same number of customers were affected. BGE did not report a major outage.

Those figures are reinforced by a study by the Maryland Office of People's Counsel, an agency that represents consumers. The study found that during the storms in February, Pepco customers suffered the longest outages among customers of the six biggest power companies serving Maryland.

Pepco outages averaged 13.6 hours. Other companies had average outages of about six to eight hours; BGE customers suffered interruptions averaging 8.1 hours.

If deregulation is to blame, wouldn't all services be just as lousy as Pepco's? Or perhaps we should be looking at the accounting practices and compensation schemes at Pepco that encourage short-term profits.

What about side effects?

While the reliability debate continues, Howard Hartmeyer, co-owner of J & H Power Equipment in Crofton, is watching his business thrive. He says generator sales, primarily to homes in Pepco's service area, have jumped 60 percent since June, crowding out all other work at his family-run company. His biggest seller is a natural gas generator that costs about $9,400 installed and can power an entire house.

SAS Graphics

Doing graphs in SAS is a pain – mainly because options for high-resolution graphics are device-specific. JPEG is what I usually work with for inclusion in LaTeX documents, but they never come out as crisp as I would like. What the SAS folks really need to do is come out with specific recommendations for specific types of output – JPEG, BMP, etc. – for specific purposes: if we want to use this type of image for a journal article, what should the options be? What if I want to use them for a web page? What are the options for a 3x5 or a 4x6?

I realize that the combinations can be mind-boggling, but they really need to lay down some kind of recommendations, because I ended up spending two hours trying to get the graphics options just right for a high-resolution 3x5 in. JPEG.

This was what I came up with for Windows 7, SAS 9.2 Release 1:

goptions reset=all gsfmode=replace
  device=jpeg gsfname=grafout
  htext=0.9 fontres=presentation ftext=Verdana
  xmax=6in ymax=6in hsize=5in vsize=3.5in
  xpixels=3600 ypixels=3600;

Unfortunately, even with running PROC FONTREG I keep getting the message that font 'Verdana' cannot be used. Oh well, I'll have to figure it out another day when I have two hours to kill.
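For the record, the next thing I would try is registering the Windows TrueType fonts explicitly before setting the goptions – a sketch only, assuming the standard C:\Windows\Fonts location (the path will differ on other machines), and with no guarantee it cures the problem on 9.2:

```sas
/* Register the Windows TrueType fonts with SAS so that a    */
/* font like Verdana can be resolved by the JPEG device.     */
/* MODE=ADD keeps any fonts that are already registered.     */
proc fontreg mode=add;
  fontpath "C:\Windows\Fonts";
run;

/* Then reference the registered system font in quotes: */
goptions reset=all device=jpeg ftext="Verdana";
```

If the message persists, the device driver may still be falling back to its hardware fonts, which would be another two hours to kill.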