Had coffee with a friend recently and we drifted into a conversation about system design. The crux is that we’ve both worked in organizations that are tied to very large codebases… But, in many cases — products with 100M+ users — they are monolithic piles of dung. Non-trivial compile times, codebases that take minutes to start up, huge memory footprints. Systems that only survive because the DB has just a bit more headroom than their usage demands.
Now, as I set out on many other projects and build, code, and create, I have to wonder: am I just creating the forefront of monolithic systems that will eventually succumb to the same pressures? What internal force causes organizations to create “service-oriented architectures” that have loose coupling, yet also the high availability necessary to ensure you don’t need some HP NonStop system to keep your site running? Is it force of will, design principles, or some bit of Computer Science DNA that isn’t taught?
I’ve gone off and seen the light of component systems, message queues and routers, design for failure, future-proof APIs. And yet it’s something people tend to turn up their noses at, falling back on the attitude that if it’s not integrated, it doesn’t exist…
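To make the “message queues + design for failure” idea concrete, here is a minimal in-process sketch. It assumes nothing beyond the Python standard library: a `queue.Queue` stands in for a real broker, and the names (`process_order`, the dead-letter queue, the retry count) are invented for illustration, not taken from any particular system. The point is the shape: producer and consumer only share a queue, and a poison message gets parked instead of taking the whole system down.

```python
import queue

def process_order(msg):
    """Pretend downstream service; fails on malformed input."""
    if "id" not in msg:
        raise ValueError("missing id")
    return f"processed {msg['id']}"

def consume(work, dead_letter, max_retries=3):
    """Drain the work queue; retry failures, then park poison messages."""
    results = []
    while not work.empty():
        msg = work.get()
        attempts = msg.setdefault("_attempts", 0)
        try:
            results.append(process_order(msg))
        except ValueError:
            msg["_attempts"] = attempts + 1
            if msg["_attempts"] >= max_retries:
                dead_letter.put(msg)   # design for failure: isolate, don't crash
            else:
                work.put(msg)          # requeue for another attempt
    return results

work, dead = queue.Queue(), queue.Queue()
work.put({"id": 1})
work.put({"oops": True})   # malformed message
work.put({"id": 2})

print(consume(work, dead))   # → ['processed 1', 'processed 2']
print(dead.qsize())          # → 1 (the malformed message, quarantined)
```

The loose coupling is the whole trick: the producer never calls `process_order` directly, so the consumer can be slow, restarted, or replaced without the producer knowing, and one bad message costs you a dead-letter entry rather than an outage.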
Any other thoughts on how to create a culture that admires distributed simplicity?