November 09, 2006
Zero Tolerance Manifesto
Based on some conversations at my present job, I decided to write up what I learned during my time at BEA. These are the things it takes to keep a large development team productive. It only really works when these things are built into the culture. You will need tools and infrastructure to make this practical, but if you instill this into your culture, your team will build what you need when you need it.
Just to be clear, the most important thing you do is check in new or improved code into the product. The things below are preconditions to checking in code, the things you do to be a professional developer. If you're doing this right, these things will speed you up, not slow you down. These are not an excuse to decrease your sense of urgency about moving your project forward. Have you checked in code yet?
These are in order of priority:
If you want to break the build in the privacy of your own machine, that's your business. The second you break the group build, the morale and productivity of every person in the product group dives towards zero. Unacceptable.
Don't f'ing break the build. Ever.
Here are the steps you *must* take in order to avoid breaking the build.
- Build your changes before you check them in. Make sure you sync up with the product line before you do it -- building against a month-old copy of the source doesn't really tell you anything.
- Stick around after you check in and make sure the group build succeeds. If you broke it, DROP EVERYTHING AND FIX IT. Fix it like your job depends on it. Because it does.
- Don't check in to a broken build. The second you check in to a broken build, you become part of the problem -- you'd better start looking for the solution. It sucks to wait, but that's what you should do. Or better, find the person who broke it (or his friends) and make them fix it. Or better still, figure out what's broken and propose the fix.
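The pre-checkin gate above can be sketched as a tiny script. The step names here are hypothetical stand-ins for whatever sync, build, test, and checkin commands your shop actually uses:

```python
def safe_checkin(sync, build, run_tests, checkin):
    """Gate a checkin behind the steps above.

    Each argument is a callable standing in for a real command
    (source sync, product build, test run, checkin) that returns
    True on success. These are illustrative names, not any real
    tool's API.
    """
    if not sync():
        return "sync failed -- resolve before building"
    if not build():
        return "local build broke -- do not check in"
    if not run_tests():
        return "tests failed -- fix before you check in"
    checkin()
    return "checked in -- now watch the group build"

# A clean run reaches checkin; a broken build never does.
status = safe_checkin(lambda: True, lambda: True, lambda: True,
                      lambda: None)
```

The point of the sketch is the ordering: checkin is unreachable until sync, build, and tests have all passed on your machine.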
Tests are the team's safety net. Having all the tests passing all the time makes it dirt simple to figure out if you've broken anything with a change. When it's dirt simple to figure that out, you will have more confidence in your ability to make changes safely -- you will write more code and won't shy away from refactoring or taking on changes in new areas.
Don't break tests
- Run the tests for all affected areas and fix any regressions *before* you check in.
- Check the results of the group tests after you check in -- assume any failures are yours until you explicitly rule them out. If you broke tests, fix them before you work on anything else.
- Broken tests are not an excuse to ignore the tests. Be sure your change doesn't make the situation worse. If the tests that are broken directly cover the area you are going to be working on, you should probably fix the tests before you make your change. If the product isn't currently at a 100% pass rate, then you'll have to baseline the test results before you make your change and compare them to the results afterward. Yes, this sucks, but do it anyway.
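When the suite isn't at 100%, the baseline-and-compare step boils down to a set difference. A minimal sketch (the test names are made up):

```python
def new_regressions(baseline_failures, current_failures):
    """Tests failing after your change that passed in the baseline run.

    Pre-existing failures were already broken; anything failing now
    that wasn't failing in the baseline is on you.
    """
    return sorted(set(current_failures) - set(baseline_failures))

# Hypothetical example: one failure is pre-existing, one is new.
before = ["jms.testReconnect"]
after = ["jms.testReconnect", "ejb.testPassivation"]
mine = new_regressions(before, after)  # -> ["ejb.testPassivation"]
```

The pre-existing failure stays out of your regression list, so you only chase what your change actually broke.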
Tests are your safety net, even for your own code. If you want to be productive, having tests that show something is working gives you a great sanity check on your code, but it also pays dividends over the long haul -- your tests keep you from breaking your own code.
Tests are also your shield. If you write a feature and someone later discovers that some part of the feature that didn't have a test is now broken, guess who's going to have to fix it? You. Even if it was someone else's change that broke the feature. The best way to avoid this is to have tests for every aspect of your code that matters. Then it's everyone's responsibility to keep those tests passing with every checkin, and you don't have to worry about someone else breaking your stuff.
Write tests. Lots of 'em
- Write unit tests for the public methods on all your classes.
- Write functional tests to verify that the key requirements of your feature are working.
- Write functional tests to verify that your dependencies on other parts of the product are still being satisfied. Think of these as diagnostics -- when these tests break, you should know where to look to solve the problem.
- When something turns up broken without a test, write one. Don't let the same regression happen twice.
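As a sketch of the first and last bullets, here is a unit test for a made-up helper, plus a regression test added after a hypothetical bug report:

```python
import unittest

def parse_version(s):
    """Hypothetical public helper: parse 'major.minor' into ints."""
    major, minor = s.split(".")
    return int(major), int(minor)

class ParseVersionTest(unittest.TestCase):
    def test_basic(self):
        # Unit test for the public behavior of the helper.
        self.assertEqual(parse_version("9.2"), (9, 2))

    def test_regression_bad_input(self):
        # Something turned up broken without a test; now it has one,
        # so the same regression can't happen twice.
        with self.assertRaises(ValueError):
            parse_version("not-a-version")
```

Run with `python -m unittest`. The helper and the bug are both invented for illustration; the shape of the safety net is the point.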
It is inevitable that at some point you will have to work in two branches of the code at once, usually because one release is ramping down while another is ramping up. When this happens, it's important not to let the backlog of differences between the two versions build up. The sooner you integrate a change forward, the more likely it is to work and the more likely you are to remember what you changed and why.
During parallel development, integrate changes early and often
- When you fix a bug or finish a feature in version X, integrate it forward to version X+1 immediately (apply rules 1-3 to integrations too). No sense waiting; it's only going to get harder.
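One way to keep that backlog visible is to diff the fix lists of the two branches. A sketch, with hypothetical change IDs:

```python
def unintegrated(current_branch_fixes, next_branch_fixes):
    """Fixes landed in release X but not yet ported to X+1.

    Keep this list at or near zero -- every entry is a merge that
    only gets harder with time.
    """
    ported = set(next_branch_fixes)
    return [fix for fix in current_branch_fixes if fix not in ported]

# CR-101 has been integrated forward; CR-102 and CR-103 have not.
backlog = unintegrated(["CR-101", "CR-102", "CR-103"], ["CR-101"])
# -> ["CR-102", "CR-103"]
```

Whatever version control system you use, the discipline is the same: the integration debt should be something you can enumerate, not something you discover at release time.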
It is inevitable that bugs will sneak through even the most exhaustive test suite. The unfortunate part is that these bugs always come up after you've moved on to something else. This is not an excuse to tolerate bugs, however. If you put off fixing a bug until "someday", chances are you won't fix it at all, or that it will take you a lot longer to fix then than if you just fixed it today.
Fix bugs right away.
- When a bug comes in, fix it. Whatever you're working on, come to a natural stopping point, then set it aside and fix your bug. If a bug turns out to be too large to be accommodated in the slack of your current sprint, put it on the backlog -- and do it first thing next sprint.
- Leave slack in your sprint commitments to accommodate bug fixing.
- Don't defer bugs indefinitely. If a bug isn't important enough to fix now, then you should consider closing it. This is especially true for large bugs or minor enhancements of marginal value. Keeping it around is a drain on your attention, and it invites you to keep other bugs with it. Don't be careless, but don't be overly cautious either -- if a bug's important, it'll come back.
July 17, 2006
Declining Cost of Production
I went through a phase of reading books that related economics to biology. If you think about it, a really good theory of economics should help you explain the things we observe in nature (animal behavior, evolution, etc).
During this time, I ran into this "law" -- the cost to produce a unit of anything declines in an inverse power law relationship with the cumulative number of units produced (on a global basis). And it turns out this is true for many things, including such basic things as eggs. An early formal treatment is Kenneth Arrow's 1962 paper "The Economic Implications of Learning by Doing". I ran into it in Bionomics by Michael Rothschild.
You hear a lot about Moore's law and Metcalfe's law. I'm always surprised you don't hear more about this one, since it's at least as profound in its impact on new technologies.
May 21, 2005
The Fed starts backing down on the health of the Real Estate Market
From the Wall Street Journal: The Fed Starts to Show Concern Over Bubble
For a long time, Federal Reserve Chairman Alan Greenspan dismissed suggestions that the U.S. was in the early stages of a housing bubble. He talked about the extraordinary demand for houses among hard-working immigrants. He emphasized that housing, unlike stocks, is a local market, so it's almost impossible to have a national housing bubble. He explained that it's hard to speculate in a house that you own because to sell it you have to move.
But there has been a little more concern creeping into his commentary in the past few months. "We do have characteristics of bubbles in certain areas, but not, as best I can judge, nationwide," he told a House committee in February. Mr. Greenspan speaks to the Economic Club of New York at lunchtime tomorrow. If housing comes up in his remarks or if he is questioned on the subject by one of the prominent economists there, look for the Fed chairman to mention -- as Fed Governor Donald Kohn did recently -- the upturn in people buying vacation homes, second homes or other homes on the risky bet that housing prices will continue to rise as they have lately.
Mr. Greenspan hasn't yet hit the "irrational exuberance" gong, the phrase he used to warn about the stock market in December 1996. The Fed and other bank regulators, however, this week warned banks to take more care with home-equity loans, noting that such loans are "subject to increased risk if interest rates rise and home values decline." (Did you say decline? Gulp.) Even a slowing of the pace of increase in housing prices probably would dent consumer spending, which, for the past couple of years, has been helped by Americans tapping their home equity.
Other Fed officials have begun to express some anxiety. In a speech last month, Mr. Kohn said, "A couple of years ago I was fairly confident that the rise in real-estate prices primarily reflected low interest rates, good growth in disposable income and favorable demographics." Mr. Kohn was a longtime adviser to Mr. Greenspan before his appointment to the Fed board.
No longer. "Prices have gone up far enough since then relative to interest rates, rents and incomes to raise questions; recent reports from professionals in the housing market suggest an increasing volume of transactions by investors, who...may be expecting the recent trend of price increases to continue," Mr. Kohn said.
Interestingly, the one place I saw these speeches headlined, the headline was "Greenspan says no real estate bubble". I guess the story is the improbable rise of real estate, not the risk in the real estate market.
The thing is that Greenspan only ever speaks about the national market in the aggregate, because that's his job. It's not his job to manage regional economies (in fact, that might interfere with his management of the national economy). When he finally recognizes a real estate bubble, it will only be because it has reached national proportions.
Posted by dapkus at 12:38 PM
wikipedia is becoming the source for answers
From John Battelle's Searchblog: Wikipedia and Search
A nice piece penned by Max Kalehoff.
A ranking of all Web sites based on the total volume of traffic received directly from search engines placed Wikipedia at 146 in June 2004. But in September 2004 it jumped in the ranking to 93; 71 in December 2004; and in March 2005, it was the 33rd most popular site in terms of visits received from search engines.
That means Wikipedia is impacting not only the trivial results of our Internet searches, but increasingly what content we consume and the types of answers we find to larger questions. This is a profound statement for anyone competing in the marketplace for attention to content and ideas.
Interesting. Wikis took off like weeds at work last year. The thing that's been most valuable about the team wiki is that it's been a quick place to host a page that you need to share and possibly collaborate on. We *didn't* get a beautifully tended garden like Wikipedia. I think it helps that Wikipedia is modelled on an encyclopedia -- the structure is simple and clear, and there are clear ground rules for what an entry should look like.
Posted by dapkus at 11:00 AM
anybody need an agent?
The LA Times ran A Glut in the Market for Homes last week, an article about how much interest in being a real estate agent has surged.
More than 22,000 applicants took the state's real estate exam in April, nearly three times as many as in April 2003, according to the Department of Real Estate. To handle the surge, the department has rented six test centers around the state to supplement the five it already has.
The last time so many people wanted to sell real estate in California was in 1990. In what might be an ominous sign for the current boom, that year marked a peak in the housing market.
There are 437,000 agents in California, enough to form the state's eighth-largest city. With only 680,000 home sales a year, competition for listings can be savage.
I guess all those day traders had to find something to do. So, each agent gets an average of about 1.5 sales a year. If the average house price was $500,000, and they kept all of their 3% (which they don't), that's about $23k a year. Wow. You'd think the numbers alone would be enough to discourage them. Guess all we need to do is sell more houses for more money :)
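Running the article's numbers (the average price and the full 3% commission are back-of-the-envelope assumptions; in practice agents split commissions with the other side):

```python
annual_sales = 680_000   # California home sales per year (from the article)
agents = 437_000         # licensed agents (from the article)
avg_price = 500_000      # assumed average sale price
commission = 0.03        # assumed gross commission, kept in full

sales_per_agent = annual_sales / agents                 # about 1.56
gross_per_agent = sales_per_agent * avg_price * commission
# about $23,000 a year -- and that's before splitting the commission
```

Halve it for a realistic split and you're near $12k, which makes the career math even grimmer.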
Posted by dapkus at 09:29 AM
May 16, 2005
(not) the formula for business success
There are a number of really great books about business strategy and the challenges of management for a high tech company (e.g. Crossing the Chasm, Innovator's Dilemma, etc). What makes them great is that they provide a theory of business -- an analytical framework for understanding where your business is and a set of principles for guiding your future actions.
The problem with them is that people misunderstand the role of theory in business: they treat these theories like they are a set of rules -- a formula for success -- often at the encouragement of the authors of those books.
There is no substitute for first hand knowledge of your market and a detailed understanding of your business; no business strategy can truly be successful unless it has been informed and shaped by those factors. Theory should play a supporting role in the formation of strategy, not a determining one.
In fact, the misuse of theory is quite rampant. Beyond the books, there are folk theories about everything -- why is Microsoft/Oracle/etc so successful, what makes open source so popular, etc. And it seems many people think that if they could simply repeat the formula, their business will be a success too (e.g. Sun seems to have a terminal case of Microsoft Envy).
The thing is, it wasn't the formula that made Microsoft so successful; it was seeing that the time was right and that the company was in the right position to play out a strategy like they did.
This really became clear to me when I was reading this summary of von Clausewitz, a 19th century military theorist. He was working in the time of the scientific revolution; there was a widespread belief that it would soon be possible to reduce many fields to a set of governing rules. One school of military thought believed war would soon be reduced to a set of rules for maneuvering troops to achieve advantage.
Clausewitz opposed this school of thought, arguing that theory existed not to prescribe behavior but merely to inform the thought of the general in battle. Great strategy came from great generals who had coup d'oeil -- the ability to read the state of a battle in progress and intuitively see the opportunities it offered. Theory didn't exist to be applied dogmatically, but simply to help develop the general's coup d'oeil.
Do you think you could learn to be a chess grandmaster by rote learning of a collection of classic openings and end games? Of course not -- the number of possible chess games is far too large for that to be feasible as a chess playing technique. When psychologists study chess grandmasters, to see what allows them to play such a computationally intractable game so well, one thing that stands out is their ability to very quickly read a board and understand it as a set of strategic groupings.
Why should business be different?
Posted by dapkus at 04:19 PM
hm. I'm thinking about dusting off my blog and firing it back up. It's become completely infested with comment / trackback spam. Guess I'll have to rebuild it (better, stronger, faster).
Posted by dapkus at 11:20 AM
October 15, 2004
The Long Tail whips P2P
I followed a link from Due Diligence to an excellent article in Wired on how media-based businesses (music, books, movies, etc) are fundamentally transformed by the ability to keep *everything* in stock and deliver it for almost nothing. When you can aggregate all of that demand, there is some demand for virtually everything. In fact, it turns out that there is more money to be made on the low-demand things than on the high-demand things, in aggregate. This demand has previously gone unmet because it was never possible for physical stores to meet it conveniently.
I have long suspected this was the case -- but this article makes the case very compellingly and backs it up with real world examples of the long tail in action.
Ironically, this is a business that P2P can't touch. P2P networks are great at finding things that are common within the network. When the thing you want is rare, the chance that a peer that has it will see your request is fairly small. And the chances get slimmer when you remember that many P2P clients are only occasionally connected, many are freeloaders that don't or can't serve their music, and many don't have enough bandwidth to give reasonable service. While P2P is a convenient way to get popular things free, it sucks for obscure stuff -- in theory and in practice.
So, it's not that there isn't big business to be had in digital media on the Internet, it's just that it's fundamentally a different business. In the physical world, media companies are star makers. They select an artist from the talent pool, then invest heavily in production and promotion to turn that artist into a popular phenomenon. The promotion is key -- it's what boosts demand to a level where WalMart can get the inventory turnover it needs to justify giving the album shelf space.
Online, the business is knowing your products and knowing your customers, so that you can match customers to products that fit their tastes, no matter how quirky. It's a different business, but the rewards are bigger than ever.
In fact, if you can make money on the long tail, you don't have to worry about losing the demand that P2P is good at meeting -- the demand for the popular stuff.
Posted by dapkus at 02:16 PM
September 27, 2004
WSJ: Coastal Homeowners Are Now Cashing Out
WSJ.com has an article about coastal homeowners cashing out and using their profits to buy less expensive houses outright in less expensive markets.
Is it time to take profits on the real-estate boom? The huge rise in prices in thriving cities on or near the coasts has created an arbitrage opportunity for people who have the flexibility to move: Sell Manhattan, buy Montana.
Over the past five years, raging real-estate markets in some coastal areas have more than doubled housing prices, while farther inland prices have risen more moderately. That has stretched the price gap between the middle of the country and the coasts far beyond the norm. The typical home price for the 10 American metropolitan areas with the highest housing prices has jumped to 230% of the national median from 155% five years ago, according to an analysis by Economy.com for The Wall Street Journal.
The main justification given for current house prices is the shortage of supply in hot housing markets. While I think that argument doesn't have much merit on its face, it's worth pointing out that this is one way a supply shortage could reverse itself.
Posted by dapkus at 10:07 PM
September 10, 2004
Google's "Do no evil"
People make fun of Google's "do no evil" motto. But, in their case, it's not just ethics but good business. Google lives and dies by the good will of its users and customers -- none of whom have, in a practical sense, very strong lock-in to Google's technology. At the same time, much of the low-hanging fruit for a search company like Google involves doing kinda creepy things (like pay-for-placement, selling data mined from users, etc). In order to last and avoid the fate of their predecessors, they have to walk a fine line -- "Do no evil" seems like an excellent guide.
Posted by dapkus at 02:05 PM