Wednesday, August 31st, 2011
The Wall Street Journal put up an interesting interactive map of the “United States of Venture Capital” that tracks VC investments in the first six months of 2011 by metro area and industry segment. Check it out. The opening screen shot is above. (h/t Richard Florida).
My previous post on the so-called “developer drought” prompted some very interesting comments you may want to read. I wanted to share a couple of follow-ups.
While I said that I thought the notion of a “drought” is overblown, it is true that there seem to be fewer developers than in the past and that we are having difficulty scaling up the developer pool the way we did during, say, the dot-com boom. I have a few thoughts on why this might be.
Firstly, there are multiple sources of demand for programmers: tech companies, consultancies, IT departments, etc. While demand seems to be robust on the tech company side and in various tech specialties, there have really been some huge changes in the consultancy and IT department markets that have depressed entry into the programming field and even prompted people to exit it.
Most notable is the rise of offshore development. As recently as 2001 or so, pretty much all development was done onshore. Then the Y2K remediation boom came and went, the bottom fell out of the dot-com market, and the Indian outsourcers came onto the scene with a vengeance. Today, there’s an enormous focus on the consulting and IT department sides of the house on moving as much work as possible offshore, because the costs are so much lower. And having done development with teams in India, Spain, and Argentina, I can tell you the talent is solid and the motivation levels are high in those places.
Along with that, we’ve seen the rise of a theory that says, as one famous article put it, “IT Doesn’t Matter.” The underlying notion is that IT is becoming more of a utility and less of a differentiator, and thus the focus will shift to managing it like any other cost – that is, cutting it as much as possible. This is certainly not a universal view, but it is influential in many respects.
And corporate IT hasn’t changed its fundamental paradigms as frequently in the 2000s as it did in the 1990s, leading to the commoditization of many functions, such as screwing in SAP systems.
Give America’s youth some credit for getting it on this one. They saw the long-term trend here and decided to say “No, thanks” to that type of career. They watched what happened to manufacturing as a 30-year squeeze that has yet to abate (and likely never will) made it a tough and increasingly poorly paid sector in which to work, one where you constantly walk around with a target on your back. Fear of that happening to them looms large. I have a friend who was on the advisory board of a Big Ten business school. At one point that school had seen enrollment in its MIS program drop from 300 to fewer than 30, and it was contemplating dropping the program altogether. The days when Andersen Consulting would hire thousands of bright people off campuses, put them through boot camp to teach them the basics of coding, then send them off to move mountains are over – at least in the United States. I think we have probably seen a large drop-off in the number of people entering the tech profession.
Now you might be saying to yourself, “IT? MIS majors? What the heck is that? I need real programmers.” And therein lies another trend I observed: the increasing estrangement between corporate IT and the tech world, and the resulting desire by tech companies not just for programmers, but for a certain type of programmer.
As the IT world went one direction, the tech company world went another. Open source LAMP-stack architectures sent the cost of software to zero for a new web company. The rise of cloud computing has driven hardware and hosting costs to near zero as well. This dramatically reduced barriers to entry. Also, while corporations stayed close to home with Java and .NET, much of the rest of the world moved on, to architectures like Rails and to platforms like iOS and Android instead of the web. This reshaped what a startup looks like and even spawned new philosophies and methodologies of software development such as the “lean startup” and 37signals’ “Getting Real.” A lot of the success of these companies is rooted in understanding the user/customer and design (perhaps inspired by the success of Apple), as well as rapid iterations – things still done best in America.
My own Telestrian app is an example of what this new world has wrought. A decade ago, it would have taken millions of dollars to start a company. Today, I built Telestrian myself with sweat equity and a tiny amount of cash for point assistance (such as my lawyer) under a bootstrap model.
These two evolving paths have driven a wedge between the IT/consulting model and the tech model, bifurcating the labor market and the culture to a far greater extent than in the past.
One commenter on the previous post noted: “The developers that startups want to hire are a special breed: broad and deep technical skills, very current on technology and practices, ambitious and self-directed, derives satisfaction from the startup’s vision so they will work for less $$, good communicator, proven track record of success.” This shows that tech startups are increasingly focused on a narrow model of developer to staff their teams.
I think one of the challenges facing tech hiring is that tech companies have set up artificial boundaries around their potential labor force. Back in the dot-com days, we also had to radically scale up our programming staff at the macro level. But we did it. Sure, a lot of people went into the field who shouldn’t have, and they wrote a lot of crappy code. But you absolutely had to have an e-commerce site, the CEO wouldn’t take no for an answer, and guess what? Lots and lots of stuff of pretty good quality actually got written in a very short period of time – and even without the architectures we have today that make it so much easier to do things.
Perhaps today’s models work in a certain sense, but if there isn’t the labor force to staff them, then maybe that model needs to be rethought. Perhaps we do need to raise more capital and be open to hiring from a broader pool of people, for example.
I think the dot-com era holds two important lessons. First, companies hired bright people and let them grow into the positions and figure the internet architectures out. I see no reason why we can’t do this again today. I had lunch today with the VP of Applications for a .NET IT shop in Chicago and asked him point-blank if he had problems finding .NET programmers to hire or bring on as contractors. He said no. He uses mostly contractors these days, but there isn’t a shortage on the market. Why not grab some of the .NET IT guys who’ve shown they can develop code and teach them – or let them figure out – what they need to do for your Rails app or whatever? Another commenter said he’s had three open programmer positions for nearly six months and hasn’t been able to fill them. In six months, even a merely competent programmer could have cross-trained to a new platform and gotten productive – and would probably be fired up about getting to do so. Again, remember, we took in a lot of raw material during the dot-com era and ended up with a lot of good tech stuff. And people want a job in which they can learn something new, stretch themselves, and grow. Hire one of those water walkers of the type profiled above and they’ll quickly get bored and move on for greener pa$ture$.
Second, salaries went way up during the dot-com era. I think my point about pay was misunderstood by some. Raising wages isn’t just a method of reallocating a fixed pool of talent from banking to tech or something like that. On a typical supply curve, sellers are willing to supply more product at a higher price; a higher price attracts new supply to the market. There are lots of people who left the programming field for something else. Or who, like me, got promoted into management. Or who are in legacy technologies and haven’t skilled up on something new. Or who are in school and trying to make up their minds about what to do. Etc. These are all potential sources of new or latent talent in the marketplace. The price signal is important for bringing that potential supply into the marketplace, along with potentially some business/staffing model changes to make the tech world more appealing to those people.
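To make the supply-curve point concrete, here’s a toy sketch of an upward-sloping labor supply curve. Every number in it – the base wage, the slope, the wage levels – is invented purely for illustration, not drawn from any data:

```python
def quantity_supplied(wage, base_wage=60_000, slope=2.0):
    """Illustrative linear labor supply: thousands of workers willing
    to take programming jobs at a given annual wage. Below the base
    wage, nobody enters; above it, each extra $1,000 attracts `slope`
    thousand more workers (lapsed programmers, career switchers, etc.).
    """
    if wage <= base_wage:
        return 0.0
    return slope * (wage - base_wage) / 1_000

if __name__ == "__main__":
    # Higher wages pull latent supply into the market.
    for w in (70_000, 90_000, 120_000):
        print(f"${w:,}/yr -> {quantity_supplied(w):.0f}k workers")
```

The specific curve doesn’t matter; the point is simply that quantity supplied rises with price, so a sustained wage increase draws people back into (or newly into) the field rather than just shuffling a fixed pool around.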
Of course, whether higher pay attracts them also depends on whether they believe those salaries are permanent or part of another bubble. People got burned when the dot-com bubble blew up, and even today you read about another bubble. Once bitten, twice shy, so it will take time to convince people. But while there will always be ups and downs, I believe tech is here to stay in America. We just need to figure out how to attract a labor force to it and build business models that work well with the labor force companies can actually get.
Update: Check out this new piece over at New Geography: “Supply of Tech Workers Greater Than Estimated Demand”