What Andy Giveth, Bill Taketh Away
Deep Blue, the supercomputer that outmatched—and supremely out-calculated—Chess Grandmaster Garry Kasparov, was a wonder to behold in 1997. A large and bulky machine compared to today's computers, Deep Blue was in a performance class all its own. This machine, publicized as part of an "Artificial Intelligence" revolution, brute-forced all of its chess-winning calculations with a computing power of 11.8 GFLOPS.
Wow! 11.8 GFLOPS! That's incredible!
And for a machine of its size and ingenuity in 1997, it really was. But what about today?
The iPhone X comes swinging in at 600 GFLOPS.
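If you like seeing that gap as a single number, the back-of-the-envelope math (using the figures quoted above) is simple:

```python
# GFLOPS = billions of floating-point operations per second.
# Figures as quoted above; raw FLOPS is a crude comparison, but telling.
deep_blue_gflops = 11.8    # Deep Blue, 1997
iphone_x_gflops = 600.0    # iPhone X, 2017

print(f"{iphone_x_gflops / deep_blue_gflops:.0f}x")  # ~51x
```

That's a phone in your pocket with roughly fifty times the raw compute of a chess-champion supercomputer.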
Why bring up geeky metrics such as FLOPS?
Yes, we all generally know that hardware has seen exponential processing-power gains over the past half-century, with transistor counts doubling roughly every two years. That's Moore's Law!
But besides Moore's Law, there's one law that is much more relevant today and not as well known:
Andy and Bill's Law
"What Andy Giveth, Bill Taketh Away"
It's a funny line because it speaks truth. Andy Grove, legendary CEO of Intel, led his company to create the most powerful and widely-adopted microprocessors in the world. Bill Gates, with Microsoft, would then write software to suck up all of that processing power. Andy would then create more powerful processors, Bill would use it up too—and the cycle would continue.
With more "breathing room" available on hardware, software performance became less and less of a priority. And if you're Microsoft in the 1990s with a near-monopoly business, you care about it even less. Who hasn't gotten frustrated using Microsoft Office? And let's not even get into certain iterations of Windows or Internet Explorer.
Andy and Bill's Law is even more relevant today, because it has seeped into a lot of widely used production code. Even many software developers aren't cognizant of it! We take for granted the hardware our programs run on, and end up writing subpar code, just because we can get away with it.
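Here's a deliberately contrived Python sketch of what I mean, code that works, ships, and quietly squanders the hardware's generosity (timings will vary by machine):

```python
import time

ids = list(range(20_000))

# Subpar but "good enough": membership tests against a list are O(n),
# so this loop is O(n^2) overall. Fast hardware lets it pass review.
start = time.perf_counter()
hits = [x for x in ids if x in ids]
print(f"list lookups: {time.perf_counter() - start:.2f}s")

# The same logic with a set is O(n) total, orders of magnitude faster.
id_set = set(ids)
start = time.perf_counter()
hits = [x for x in ids if x in id_set]
print(f"set lookups:  {time.perf_counter() - start:.2f}s")
```

Both versions produce identical results. One of them just eats whatever Andy giveth.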
But it does affect software users.
Let me give you some examples.
I'm writing this because of recent issues I've encountered, specifically in the education domain.
McGraw-Hill: A Digital Dinosaur
McGraw-Hill is a publisher trying to offer students digital services like a caveman trying to sell rocks at a quantum-computing convention.
When I purchased the digital version of a textbook I was required to use through McGraw-Hill, I found that I couldn't even access it over a secure connection:
I paid $140 for this.
Not only is this unacceptable by modern web development standards, but the user interface is also garbage. If I wanted to read my super-expensive book from my phone, I couldn't: it's not mobile friendly.
(I’m trying not to show any pages with content from my $140 online book for obvious reasons.)
Pages lag. Images take forever to load. I find myself grumbling often while using the platform, which takes me out of my "learning experience."
And why would I download the mobile app? It has a 1-star rating, and the last thing I want is to install yet another app.
The cherry on top? Many of its services still run on Flash, which is old, insecure, and inaccessible on mobile. Even Adobe is discontinuing support.
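The maddening part is how cheap some of these fixes are. Enforcing HTTPS, for example, is a handful of lines at the application layer. A minimal sketch as a Python WSGI middleware (my toy code, not anything from McGraw-Hill's actual stack):

```python
def force_https(app):
    """Wrap any WSGI app so plain-HTTP requests are redirected to HTTPS."""
    def middleware(environ, start_response):
        if environ.get("wsgi.url_scheme") != "https":
            host = environ.get("HTTP_HOST", "localhost")
            path = environ.get("PATH_INFO", "/")
            start_response("301 Moved Permanently", [
                ("Location", f"https://{host}{path}"),
                # Ask browsers to skip plain HTTP entirely for a year.
                ("Strict-Transport-Security", "max-age=31536000"),
            ])
            return [b""]
        return app(environ, start_response)
    return middleware
```

(In practice you'd terminate TLS at a load balancer and redirect there, but the point stands: this is table stakes, not rocket science.)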
Canvas LMS: Unstable and Unapologetic
While a Learning Management System may not be “mission critical,” it absolutely is critical for student use—especially when all grading and class tracking runs through the platform.
In short, your expensive higher-ed degree is at the mercy of someone else’s system architecture. That’s not trivial.
One regular weeknight (when students are expected to be online), I was getting an Internal Server Error on Canvas:
This lasted for the better part of an hour.
For a paid Learning Management System, this should not happen.
This isn’t like Instagram crashing so I can’t see what my BFFs had for brunch. I get frustrated because I have important work to do.
And what is Canvas' response?
Absolutely nothing.
No tweet. No email from the university. Just roll with the crashes as if they never happened.
The Architecture Question
I’m not sure the people running these systems understand this, but good software architecture is genuinely complex.
What? Did they think they could get away with a basic monolith forever?
(Sorry, I still love you, Ruby on Rails.)
“CRUD applications are simple!”
Yes—if you’re building a prototype at a hackathon.
Not if you’re designing for scale and performance.
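To make that concrete, the classic way a "simple" CRUD app falls over at scale is the N+1 query. A sketch using Python's standard-library sqlite3, with a schema invented for illustration:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE courses  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE students (id INTEGER PRIMARY KEY, course_id INTEGER, name TEXT);
""")
db.executemany("INSERT INTO courses VALUES (?, ?)",
               [(i, f"Course {i}") for i in range(100)])
db.executemany("INSERT INTO students VALUES (?, ?, ?)",
               [(i, i % 100, f"Student {i}") for i in range(10_000)])

# Naive CRUD: 1 query for the courses, then 1 query per course.
# That's 101 round trips for 100 courses.
for course_id, _name in db.execute("SELECT id, name FROM courses").fetchall():
    db.execute("SELECT name FROM students WHERE course_id = ?",
               (course_id,)).fetchall()

# One JOIN does the same work in a single round trip.
rows = db.execute("""
    SELECT c.name, s.name FROM courses c
    JOIN students s ON s.course_id = c.id
""").fetchall()
```

With an in-memory database on a laptop, nobody notices the difference. With a network hop between the app servers and the database, and a few thousand concurrent students, users absolutely do.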
So what’s the point?
Clearly, I’m not thrilled about these encounters with subpar software.
But I’m not expecting Canvas LMS or McGraw-Hill to make revolutionary changes anytime soon.
What I am hoping for is to reach software developers and decision makers, especially those just starting out:
Don’t taketh away.
Be smart about your software quality.
Ask yourself:
- “How do my experiences scale?”
- “How critical are features XYZ?”
- “What are our system architecture’s failsafes?” (one concrete example sketched below)
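On that last question: a failsafe doesn't have to be exotic. Here's a minimal circuit-breaker sketch in Python (illustrative only, and far simpler than anything a production LMS would need). After enough consecutive failures it stops hammering the broken backend and serves a fallback until a cool-down period passes:

```python
import time

class CircuitBreaker:
    """After max_failures consecutive errors, fail fast for reset_after
    seconds instead of piling more load onto a struggling backend."""

    def __init__(self, max_failures=5, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn, fallback):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                return fallback()        # circuit open: fail fast
            self.opened_at = None        # cool-down over: try again
            self.failures = 0
        try:
            result = fn()
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            return fallback()
        self.failures = 0
        return result
```

Even something this crude, wired to a static "we're having trouble" status page, beats an hour of raw Internal Server Errors and silence.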
There are deeper discussions to be had, on how abstractions affect code performance and on the evident problems of Object-Oriented Programming, but those can wait for another day.
For now, just ask yourself:
“Do I want to be like Internet Explorer?”