Sunday, December 29, 2013

Thoughts on Vertical Integration: The Cloud and (Gasp!) Ma Bell

Although it has been a couple of months since the hysteria of re:Invent, there has been a great deal of consternation among hordes of people in enterprise IT over the emergence of Amazon AWS. The fear is that AWS and its main competitors will substantially disrupt the market for enterprise infrastructure and create the next big oligopoly or even monopoly in the technology industry: The Cloud. In this weekend's Barron's, we find Tiernan Ray making the near-term case for mediocrity, if not eventual doom, for the big enterprise technology players (Note: subscription may be required):

No Silver Lining in the Cloud for Blue-Chip Techs

The key concern among investors - both venture and public market - and users everywhere centers on the notion of AWS becoming a "dominant exchange". This is what happens when the game becomes one where the entire solution set is owned by a single player. In that scenario, scale matters most, and customers no longer see an incentive to go anywhere else for their needs.

Well, rather than try to imagine a bleak and scary future where all IT is dominated by a large evil corporate entity, it turns out we can put our fantasies aside and just open a few history books. If you are an American of a certain age, or even a current or former resident of select countries, you have probably lived in a world that was eerily similar to what everyone is afraid of. In the twentieth century, the paradigm of technological vertical integration has to be the old, pre-breakup Bell System - also known as American Telephone & Telegraph. It turns out that looking at AT&T's business model and history is hugely instructive for those who are planning strategies in the Everything-As-A-Service world of the early 21st century.

Voice-As-A-Service circa 1900

If it weren't for some bad behavior and a bunch of concomitant anti-trust lawsuits, the story of AT&T would be an example of a great American capitalist success. Famously, Alexander Graham Bell invented the telephone in the late 19th century and set out to commercialize his discovery with his eponymous company. In order to deliver the service, the Bell System became an end-to-end solution for making voice calls to other Bell customers. This meant that they provided the telephones to their customers, they strung the wires on the streets, built and maintained central offices where operators would help patch point-to-point connections for customer calls, manufactured all of the equipment that the operators and customers used, and serviced all parts of the system in case of trouble.

To make the story interesting, the company was an aspiring monopolist. They refused to interconnect with other telephone providers unless they sold out to Bell, and they set the standard for how interconnection would work when they owned such a subsidiary. This got them into trouble when they acquired control of their old rival, Western Union, and attracted the attention of the anti-trust authorities of the pre-World War I federal government. Eventually, AT&T was granted a legal monopoly and became a regulated entity. What is fascinating to consider, however, is how they operated within those constraints for 60 years. While the world has changed a lot in the last 100 years, the recent discourse around the cloud has had me making a comparison: Is The Cloud - or AWS, in particular - destined to be the new Bell System?

The tradeoffs of this extreme level of vertical integration are particularly interesting. As a result of owning everything, the Bell System actually worked. The phone system, the switch gear, the telephones, and everything along the way was overbuilt and extremely reliable. It is hard to remember a time in the 1970s when phone service was unreliable or when anyone needed to troubleshoot a telephone problem. On the other hand, the pace of innovation was stiflingly slow. If it weren't for telecom deregulation in the 80s and 90s, we would not have meaningfully fast data network access available today. We would not have VoIP. Long distance calling would be inexplicably expensive. Most importantly for most of us, the economy would be bereft of the trillions of dollars in market value that have been derived from IP networking.

When one considers the direction of the products and technology in the 80's, it is obvious that AT&T did not perish as a result of anti-trust action, but rather from obsolescence, just like every other tech company. They were so busy attending to that enormous customer base that they could not address the needs of their more innovative customers. In fact, any enterprises that had advanced telecom needs had already found ways to work around AT&T long before it was broken up. Media companies were sending transmissions via satellite links, and large enterprises were buying private switching equipment from a slew of competing companies that catered more to their needs than the one-size-fits-all approach offered by "the phone company". If it weren't for their status as a legal monopoly, they would have been disrupted much sooner.

Is IT-As-A-Service Destined to Remake Enterprise IT?

Looking at the Cloud, the key innovation that the established enterprise players are finding disruptive is really a business model. Turning a product into a service in a way that makes economic sense is a ground-up endeavor. Setting that aside, this generation of technologies built for The Cloud is just as easily disrupted as previous generations. Whether or not the establishment players will be able to hang on to their customers hinges on the same factors that have always been important: Will the incumbents be able to adjust to the evolving needs of IT, or will someone new take the business away? Most importantly, do AWS and the other service vendors provide any meaningfully new technologies that need to be consumed as a service rather than a product? It is still far too early to tell. Certainly, it is not yet time to sell the incumbents short. The next few years will be interesting, though.

Sunday, November 17, 2013

Can A Slingshot Take Down The AWS Juggernaut?

If your purpose in life is to entertain the gods, you might as well put on a good show.

Amazon's re:Invent show last week turned out to be quite a coming-out party for the predominant infrastructure service provider. There were a number of interesting announcements that moved the stocks of perceived competitors, and, of course, there was a disproportionately large showing of attendees at the Venetian in Las Vegas. For me, the highlight was provided by good friend, and super smart VC, Jerry Chen, who went on The Cube to throw down the gauntlet starting at about 7:00 into his interview.

The salient point Jerry made was the comparison between an ascendant Amazon AWS and the now-decidedly incumbent Microsoft circa the 1990's. His argument is that there are really two investable bets to make (well, he said three, but the third is less interesting to me). The first is that there are companies to be formed that can make AWS more enterprise-ready, and the second is that you can invest in someone that will take down AWS. If nothing else, it is heartening to see Jerry has taken up the mantle of the venture capitalist, and has pointed to the next big hill for the army to conquer. As entrepreneurial cannon fodder in this battle, I greatly appreciate the affirmation.

Well, since I can't sleep well on airplanes, and I had the misfortune of taking an overnight flight home from Las Vegas, I had plenty of time to think about this matter. After reaching back into my memories as a newly minted engineer in 1991-1993, it is obvious that history does indeed rhyme if it does not repeat. Hence, I think I have come to a conclusion on which will be the better bet if I were investing venture money right now.

A Tale of Two Microsofts

Having had the pleasure of attending some of the original Win32 Developers' Conferences, the Microsoft PDC's, during the early 90's, as well as the WinHEC conferences up until 2008, I had a semi-privileged, front row view of the revolution that Microsoft led over the course of a decade. Given the day-to-day reality in which we operate, it is easy to forget that the world at the time of the 1993 PDC was radically different from what we have today. In fact, most readers may find my observations of being a Windows "dev" back then quite amusing.

As I recall, developing code for Windows before Windows NT was released was a trying process. The development tools before Visual C++ were not that friendly. The operating systems were either very buggy and unreleased (as in the 32-bit NT), or super-duper buggy and shipping (Windows 3.0, 3.1, 3.11). In the case of the latter releases, you had no meaningful memory isolation between tasks, and so a simple coding typo could take down the whole box while you were working. Having learned to code on the Berkeley and AT&T UNIX operating systems, I felt like I was playing with a toy rather than a tool of enterprise transformation.

Back then, the "real" systems that businesses ran on were either UNIX-based mid range servers, or mainframes. The PC client was really just an over-powered terminal that also had some desktop apps, as well as file and printer sharing. Strategy discussions in PC software companies centered around how overpowered the clients were in comparison to the big iron, and what that fact foretold about the future. Indeed, the developers' conferences were largely full of optimistic young developers trying to change the role of the PC architecture. The rhetoric coming from Gates, Ballmer, and Allchin included lots of chest-pounding bravado about how good the next generation would be, and how it would take on a huge role in the enterprise - if only we developers would agree to write awesome new apps for Win32. We all would go home with palpable excitement and a religious zeal.

A little over a decade later, the world was radically changed. Perhaps not for the better. Substantially all of the mid-range systems vendors had vanished. Windows was dominating both the client and the back office of enterprises everywhere. The ecosystem that had developed all those awesome Windows apps had largely been cannibalized by Microsoft, and all those developers had moved on to writing web apps or other cool, Linux-based things that were out of the way of the perceived MS predatory machinery. It was here that compute and storage virtualization emerged as dominant technologies. It is easy to argue that AWS and VMware tipped the datacenter market from the new incumbents right here.

Amazon Looks More Like the MS of the 90's

Last week's conference was full of developers. The rhetoric brought back memories of the 90's. The show floor was full of small, venture-funded companies. The representatives of the big incumbents were trying to make themselves invisible. Everyone, including the VC's, was talking about how AWS was not ready for the enterprise - yet. Although rumors suggest that AWS is a $5 billion revenue stream, it does not look like the big players are using it yet.

It's hard to envision that AWS can be tipped over when its user base is not the demanding enterprises that make up the bulk of IT spend in the market. The fact that it got to this point based on the grass-roots support of a big developer community makes it very scary. People are right to be afraid that this company could be the next big IT monopolist. However, if you are an investor, would you consider it an easier bet to take them down, or to help them achieve the dominance they seek? The key, in my opinion, rests with Amazon. If they take a page from their neighbors in Redmond, and eat their ecosystem, the community will move on very quickly, and the bet is an easy one.

Monday, October 28, 2013

Technology, The Mirage of Shareholder Value, and Why Icahn Should Just Retire

“We may see the small Value God has for Riches, by the People he gives them to.”
Alexander Pope (1688-1744)

There are some days that I feel as if I am a late night talk show host blessed with a particularly inept politician. This past week was particularly fortuitous for me, as Carl Icahn has proven himself to be a gift that keeps on giving. I have been fairly blunt in my assessments of his spectacular effort to morph the Dell LBO into a goat rodeo (See here and here.) Hence, it makes me crazy that he managed to show up in the headlines once again this week, engaging in the same fatuous behavior that makes him a caricature of what my friends in finance would call "dumb money". This time, despite the ostensibly impossible odds of success, he decided to take on Apple. For a quick primer, the NY Times Dealbook blog does a great job:

Icahn Amps Up Pressure on Apple, but His Stake Limits His Leverage

It is hard to overstate the pointlessness of this move. As I type this, Apple's market capitalization is an immense $484 billion. For Icahn to get the 5% stake in Apple needed to incite his usual proxy cat fight, he would need to come up with well north of $20 billion in cash. Which would be fine, except that he just doesn't have that kind of money. How do we know? Because he was such an abject failure at topping the $24.4 billion offer that Michael Dell and Silver Lake were making for Dell. With bluffing skills like these, he needs to be kept away from the poker table at all costs. I haven't had this much fun watching M&A since a fish oil company called Zapata tried to buy an Internet company 6 times its size with stock during the dot-com boom. Here's some advice for Carl: Stick to raiding businesses where you might have some rudimentary understanding of their operations, like, say, lumber, fish oil, or buggy whips. If you can't find any, it is far better to quit at the top of your game rather than have us remember you as a laughingstock.

With that said, Icahn is but a pathological symptom of a much bigger problem in the industry: Management's over-reliance on optimizing for a high stock price rather than for building a sustainable business. This is especially true for the information technology industry, where the reductionist private equity strategy of cutting research and development in order to run the business for cash flows makes no sense. The fact is that no business can really be run as an accounting identity. In technology, however, the product sets and platforms have a half life measured in low single-digit years. Killing a single dollar of R&D will set you up for certain failure two years into the future. Tragically, stock prices get managed in 90 day intervals, so, if you are a CEO, firing your entire engineering organization will make you look like a hero in 12 months. In 30 months, you will likely be fired yourself; and your company, your customers, and your employees will be irremediably damaged.

The correct way to run a technology business - any business, for that matter - is to focus on the needs of the customer first. Build a world class product and solution set. Provide a lavish support infrastructure and build lasting relationships. The only way to do this is to assemble a talented and productive team, and show them that their contributions are valued. Build their loyalty. Enable them to delight customers, and support them and their needs. The wants of the typical shareholder are so far removed from business success because the typical institutional shareholder is far removed from the customer. Does Carl Icahn care about the product needs of Apple's or Dell's customers? Absolutely not. He is clearly eyeing the cash in the bank and would hire new management to implement his redistributive strategy.

Managing for shareholder value rather than for happy customers is a problem that has reached almost crisis proportions. Thankfully, at least in technology, there are always cadres of small nimble companies out there who focus on their customers. They are the ones that are privately held and VC-backed, however. In most successful companies backed by venture capital, not only are the employees focused on the customer, but so are the investors. Maybe it's not a coincidence then, but these investors seem to score some of the most amazing returns for their money over the long term. Hopefully that will not go unnoticed. In business, customers - and not shareholders - always come first.

Wednesday, October 23, 2013

The Box-ification of Software Continues

Well, Tuesday came and went and we finally got the big announcement from Apple's Tim Cook. As usual, the world was focusing on all the fun toys - the "boxes" - and, to a large extent, Apple delivered the usual array of sustaining innovations that will make their competitors seethe for the next 6 months. Despite the inevitable critiques, and the occasional gaffe (Maps, anyone?), the products will sell and achieve wide adoption. All this will happen in the face of withering competition from the commoditization experts in Asia. What got my attention yesterday, though, was one of the more modest announcements that is likely to get forgotten until it really matters, i.e. when someone's business is disrupted: Apple is giving away productivity software with its new machines.

As I type this article into my MacBook Pro, while controlling my stereo from my iPad sitting across my desk, I feel like I should repeat what I have said before: The slave-like attention to integration, user experience, and polish will always win out over gimmickry and slipshod commoditization. For this reason, I think it is quite significant to consider what Apple is doing here, and what it portends for product development trends going forward. There's always an underlying strategy to these types of moves.

First, for the Apple fanatics amongst you, it surely has not gone unnoticed that MS Office on the Mac has become somewhat of a second-class citizen. While the Windows suite was recently refreshed, it appears that the Mac version has not had a major refresh since 2011. More importantly, MS has chosen not to support any of the iOS platforms with its flagship productivity suite. There are two consequences of this decision, and both are bad for my friends in Redmond. First, it makes the lives of Mac users more disjointed and unpleasant if their data is locked up inside Office documents that they cannot edit from iPads, Minis, and iPhones. This gap creates opportunities for others to fill in the missing bits by offering apps with that functionality. In Apple's case, it also gives them a chance to show customers what an integrated suite might look like. If you haven't tried the iWork suite, you really might want to now, especially the touch-enabled versions for the tablets. Unsurprisingly, you might even start to prefer iWork over the collection of things you have now.

The flip side of this is that Microsoft is not moving the user interface idiom forward on the Apple platform any more. The future is touch and mobile, and we already know Microsoft is not there today even on their own Windows OS. They will proceed to lose ground as the masses continue to buy iPads in lieu of Windows laptops. You would think that learning to get tablets and touch right should be a strategic imperative. Apparently this is not so for them, unless it manifests itself in Windows first. They seem to have forgotten that they themselves learned how to build a GUI by building Word and Excel for the Mac long before Windows was around. In the old days, Microsoft used to be too paranoid to let these kinds of things happen. I don't think it's too far-fetched to infer that with hundreds of millions of iPads out there, it would only take a few good features in iWork to seriously damage Office. Not supporting these devices is very dangerous for MS.

The most profound part of this move, however, is the continuing theme of vertical product integration that is sweeping the low end of the technology industry. I have been talking about it for quite some time in the context of some of the things we did at EqualLogic with iSCSI storage: The lower end of any market abhors complexity. You cannot sell them a bucket of parts and expect them to build a solution from it. They will reward the manufacturers that pull together an end-to-end experience that is flawless and integrated. This is something that is unique to technology - people are afraid of it, and are always looking for an easy way to use it. Apple is running the same playbook EqualLogic ran: build all the software into a single package and polish the experience to delight the end user. For Apple, that package is a single system, either a laptop or a tablet. In our case, it was a storage array.

The big idea to draw from all this is that technology mass markets will continue to expect these vertically integrated solutions. We see Apple gradually extending theirs this week. We have been seeing Microsoft doing the same thing with their Surface line and, more recently, their acquisition of Nokia's tablet and handset business. We even see it happening in various parts of the IT infrastructure markets, where more integrated, easy to use products are favored over those that require specialization. This is a mega trend that will affect people's expectations of how to consume technology. More importantly, it will substantially raise the bar for those who want to build that technology.

Thursday, October 17, 2013

Um, Can I Work From Home? Please?

It really is amazing how organizations can take a relatively simple idea and supersize it to the point where it is unrecognizable from its original conception. Last week, in a bizarre reversal, we learned that HP has decided that their people should not work from home any more if at all possible. As usual, Arik Hesseldahl from AllThingsD summarizes it best:

Computing and technology services giant Hewlett-Packard, which appears to be taking a page from Yahoo CEO Marissa Mayer, has quietly begun enacting a policy requiring employees to work from the office and not from home.

While it hasn’t yet reached the level of a company-wide directive with the same jarring effect as a new policy put in place by Yahoo earlier this year, HP employees are being told by bosses that if they can work at the office, they should work at the office.

I thought that I'd discuss this a little because the subject comes up frequently for me and, unsurprisingly, there is no correct answer - just a lot of nuance. So, why are these large organizations recalling their remote workers? What exactly is the "state of the art" when it comes to workplace design? How did we get here in the first place, where CEOs need to issue these kinds of decrees? Well, it's quite interesting.

First, let me just make the incendiary pronouncement: Except in a small subset of job roles, working from home for extended periods is probably not a best practice. Sorry. If you read between the lines of the HP announcement, you get a pretty good reason for getting people to the office every day: immediacy of contact. This is not, as it turns out, the only reason to bring people together every day. There are all sorts of others, ranging from morale, to the rotation of the earth and the speed of light.

If you want to get into some very interesting academic research on the subject, the faculty at MIT's Sloan School have plenty of reading material for you. For example, Alex Pentland of the Media Lab has done a great deal of research that shows the value of the social circles people build within office environments. Long before him, Thomas Allen, a distinguished professor at the same institution, published a book that discussed a lot of the same kinds of issues. The overwhelming conclusion you arrive at from the reading is that the effectiveness of an organization largely depends on how well, and how quickly, information gets disseminated and processed through the ranks. People working in intellectual or physical isolation create problems. Having whole teams of people that never see each other is unambiguously bad for productivity and creativity. There's a reason that "open" (i.e. cubicle-free) office environments are popping up everywhere: They force people to talk to each other and collaborate.

The problem is that, for many years, it has been all the rage to geographically disperse organizations and try to make that work. Usually, the justification for this is money - the desire to reduce the cost of some aspect of the process by leveraging cheap labor in low-wage locales. At some point it became a real estate cost reduction: Office space can be smaller if everyone works from home, and so a company can pay less rent. Just like everything else related to pathological management behavior, if a little of something is good, a lot must be better.

Of course, these schemes have a cost, and that cost is usually management overhead. The managers have the unenviable task of getting all these people to work together. It usually means having to say the same stuff multiple times to different people in different time zones, waiting days for consensus to form, backtracking and starting over when misunderstandings inevitably arise, and a lot of wasted time. It also means that communication often becomes mostly hierarchical. Information sharing becomes stunted unless the manager is an amazing person. Worse yet, it adversely affects corporate culture in ways too numerous to mention here. It's clear that this is what Meg Whitman is ruminating on at HP. It is also an interesting thought experiment for us to consider what this trend portends for the future of knowledge work.

As for the original question about working from home, I do allow it for a small subset of people. If you are one of them, it means that I have worked very close to you before, and I understand your work habits and your thought processes intimately. It means that I have decided I will allocate the time to interact with you daily, and maybe hourly.  It means that you have committed to being in the office one week a month at a minimum, and I will make sure that your travel is paid for. In fact, if this is you, you can consider it the supreme compliment from me: You are incredibly valuable to the organization, because I do not have the time to do this for more than a couple of people and still do my job well. If this is not you, I'll talk to you at the espresso machine...

Sunday, October 6, 2013

Sunday Night Scotch: Disruption Is So... Well... Disruptive

The reason that God was able to create the world in seven days is He didn't have to worry about the installed base.  - Enzo Torresi

Maybe it's a sad testament to the human condition, but the technology industry does not seem to lend itself to attracting individuals with humility or self awareness. Since my very early days as a junior software engineer, I have noted that it is only a matter of time before many archetypical tech nerds, having achieved a modicum of monetary success, start to behave erratically. Before long, you see them standing on top of the roof at corporate headquarters, laughing maniacally. In a thunderstorm. Waving a five-iron in the air. I'll grant that this happens in other professions as well, but there's something about the rags-to-riches nature of tech that makes it all the more poignant.

If you are an executive in a leadership position at any firm, it therefore follows that you need to go through the history of other people's mistakes obsessively so you learn from them. (I recommend doing so very clinically - like the NTSB examining the wreckage of an airplane crash.) What you inevitably find, tragically, is that history really does rhyme even if it doesn't repeat. With that thought in mind, I present to you the scariest part of my reading list from last week: The Globe and Mail's investigative piece on the failure of Blackberry:

In it, you'll find a freakish retelling of pretty much every cliché of every tech downfall you can imagine. If you are a technology leader, and you'd like to engage in some negative reinforcement therapy, I recommend it highly. Likewise, if you need an excuse to have a strong drink, you'll find all you need there. Here are some of my thoughts:

Disruption Is Never Evident Until After The Fact

In her book Being Wrong: Adventures in the Margin of Error, Kathryn Schulz observes that being wrong feels exactly like being right. What we remember most painfully is the moment when we realize we are wrong. What will make you cringe, therefore, is the fact that much of the investigative narrative does not occur in the months after the iPhone's market success in 2007 and 2008. It is a story set in the events of 2012 and maybe late 2011. It wasn't until then that the mobile phone market had been disrupted to the point where Blackberry sales were beginning a secular, irreversible decline. That is a long time. Friends, that is what disruption looks like.

The point here is that you can't advertise a sea change. Smart people do not screw up this badly unless they are completely unprepared. By definition, a technology market can't be disrupted if all the players are anticipating the change. So, all you Storage Geeks take note: Things like flash technology are really more evolutionary. Everyone knows about them and everyone is making bets. It is unlikely that anyone will go out of business. In fact, recent failed IPO's are testament to that. True disruption sneaks up on you.

Listening To Your Customers Can Kill You Too

Conventional business wisdom would have executives in a bear-hug with customers, listening and catering intently to their every whim. In a disruptive situation, the customers you are listening to aren't the ones that are going to rock your world. It is the early adopters and the rebels that are driving the change. A large, established technology company cannot pay attention to these people because they do not represent a viably large market. When it would have mattered to do otherwise, Blackberry was clearly making decisions based on the needs of their huge revenue base: the one that was paying them for physical keyboards on their phones. They were doing precisely the correct thing. They were wrong to do so.

Just hiring a CTO to go watch these people won't work either. I know exactly what it feels like to point out that a fringe minority has a better idea about how to do things. In a publicly held company, the business worships at the "church of what's happening now". There is no accounting gimmick or spreadsheet you can conjure that makes an ROI case for investing in a disruptive technology. You will lose your case every time. When the return is 4 years into the future, no one will care. This is why there are venture capitalists.

Finally, we can see what can go wrong when you send off a bunch of rebels to "do the right thing" as Blackberry did with the touch screen products. They can easily lose touch with your customer base, and build something that your existing customers won't transition to. At the same time, they may not get any new customers either, unless they are truly brilliant in their execution.

No One Can Explain the Ten Year Run Apple Has Had

When the history books are written for business over the last decade, I am not sure how they will be able to distill the moves Apple made into a real, repeatable formula. The hypocrisy of criticizing the seemingly stupid, calcified, large incumbent technology organizations when the player bringing the disruption is actually a big company, too, is not lost on me. The ability to attack huge, seemingly unrelated markets from a position of financial strength helped. Not having to drag an existing user base along also seemed to go a long way. So did the organizational structure that protected the development of these new products from the vicissitudes brought forth by the markets. ...But getting it right every time on these giant bets? That is pure genius or pure luck.


Monday, September 30, 2013

So, Where Do Clouds Go To Die?

The last week's events have been very interesting, though they passed more unnoticed than I think they should have. We began the week with the rumor, ultimately proven true, that Nirvanix was about to take a dirt nap. This was a very interesting company for a number of reasons: First, they were primarily a cloud storage provider. Second, they were not Amazon or Google. They were venture funded, and clearly unprofitable. Most importantly, they had a good number of customers.

Admittedly, reading headlines like this has somewhat of a NASCAR effect on me: watching the cars crash can be most of the entertainment. There were certainly customers that would rightly lose their composure if something like this were to happen. I didn't hear a whole lot of angst, however. In fact, except for the obvious admonitions from a number of folks in the investment community that this is a bad development, I was surprised how quietly this all went down. Nobody wrote any scathing copy urging IT pros to call their storage vendors and buy good old-fashioned physical arrays, although I'm sure folks are adjusting their PowerPoint decks for their sales calls this week. (Why this seems so casual is the subject of another post...)

I was pretty much all set to shrug this event off as typical market consolidation until I got an email from Ziptr. Ziptr, for those who are unaware, was a secure document sharing service. As I write this a few days later, there's no point in linking to their web site. They, too, are long gone. Their final email to me read as follows:

Dear Lazarus,
Please export your data by 12:00 noon Eastern tomorrow, September 27, 2013.  Ziptr is closing, and all Ziptr products will be discontinued as of tomorrow.

If you do not need future access to your data, you do not need to take any action.  Once we shutdown the Ziptr service this Friday, we will be deleting ALL user data that remains in our systems. 

To guide you through the process of exporting your important information, please visit or contact  


Ziptr Support
Ziptr, Inc.

Oddly, several people had used the service to send me confidential documents. These did not belong to me, thankfully, but it suddenly hit me: the company is gone, and it has left behind some assets that someone is going to liquidate. In the old days, no one would consider anything but the physical assets to have any value, i.e. servers, printers, desks, chairs, etc. In today's analytics-driven world, however, data and its metadata have become a sort of currency. The value of whole companies can be largely attributed to the data they house and the leverage they are able to create from it. This now-defunct company was a data management company.

Thus, a series of seemingly unanswerable questions keeps coming up in my head. Some of them are quite interesting. For example, in the liquidation of a company, who owns the data? The creditors? (Check the terms of use...) What obligation do they have to abide by the same agreements as the defunct entity? (None, apparently...) If they were to receive a subpoena for your data, would they fight on your behalf? (I'm thinking not.) If they were to make your data public, would you have any legal recourse?

Here's what might freak everyone out the most: it's clear that the Ziptr guys were intent on at least trying to do the right thing to preserve the security of the data they were holding - i.e. they say they will delete it. Whether they succeed at this is another matter. Whether they had other physical media containing backup copies is yet another consideration. What happens if you are dropping your data into a service provider cloud that left the encryption up to you? What if you didn't bother to encrypt your data? Most importantly, what if they didn't bother to destroy the data after a bankruptcy?
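One practical defense against both provider failure and careless post-bankruptcy data handling is to encrypt everything client-side and keep the key yourself, so that whatever survives on the provider's disks is opaque. Here is a toy sketch of that idea in Python, using only the standard library; the HMAC-counter keystream below is illustrative only, not production-grade cryptography, and a real deployment should use a vetted encryption library instead:

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by running HMAC-SHA256 in counter mode.
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Encrypt client-side before upload; a fresh nonce makes each blob unique.
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, stream))
    return nonce + ciphertext  # ship this blob to the provider

def decrypt(key: bytes, blob: bytes) -> bytes:
    # Only the key holder can recover the plaintext, even after a liquidation.
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))
```

The point of the sketch is the division of custody: the provider only ever stores the output of `encrypt`, while the key never leaves your possession, so "deleting" your data reduces to destroying a key you control.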

In the Brave New World in which we live today, it's clear that the legal framework under which we operate may be sorely inadequate to protect us from the kind of predatory entities that may be out there. I am not aware of any of the scenarios I am contemplating having actually been tested in American courts. The outcomes may not favor the entity that created the data. It really is unclear who would win in a lawsuit. So what do you do?

First, it is doubtful these happenings will materially dent the growth of the cloud business model. We have gone too far to reverse, and the value proposition is established. On the other hand, a bit more due diligence may be required before you sign on with a service provider that is going to be holding your data. If you are going to put data in a cloud, you really need to think about a different sort of "disaster recovery", because our current set of laws is probably not going to adapt in an expedient fashion to the march of technology. At a minimum, this means you need to be sure you retain the rights to your data, that it is secure, that you can move it when needed, and that you can be certain it can be vaporized if you deem it necessary. Is this the beginning of a "Data Bill of Rights"? Well, we should be talking about it.