Much has been written about the internet’s disruption of longstanding models for education. The success of Khan Academy in K-12, the launch of Coursera, edX, and others in higher education, the publicity garnered by the Thiel fellowships, and the aggressive funding of edu start-ups everywhere (EdSurge provides solid coverage) all illustrate the opportunity to take a well-established system and do things differently. Digital channels enable new ways to learn that are not confined to one era of your life (the undergraduate years) or to one location (on campus). The goals are clear: reduce cost, provide equal or better outcomes, and spread educational access to the world.
Is health care next? Today the founder of RunKeeper made the case for mobile health in a terrifically titled GigaOm piece, Your phone will soon be your new doctor. The piece focuses on mobile and the myriad apps that have sprung up to connect device users to their day-to-day health awareness and performance. But beyond mobile, the larger point about digital disruption stands. Health care today is episodic (you go to the doctor once a year and/or when you’re ill), location-based (in the doctor’s office, out of your element), and heavily reliant on your memory as a diagnostic device. Mobile health can be just the opposite: always on, always with you, and reliant on hard data (steps taken, heart rate, meals consumed) in a way the traditional model cannot be. The future is here: welcome to your personal health KPIs.
Venture capitalists and nonprofits alike are moving fast into both education and health care, exploring ways digital channels can create new markets amid disruption. New Pew research on the news industry, released this week, reads like a cautionary tale: it’s time to focus on new delivery models before the old ones are on life support. In education and in health care, the hunt for digitally driven reimagination is on.
It takes a curious mixture of narcissism, introspection, and discipline to engage in personal analytics on any level, much less dialed up to a Feltronesque quantified self. A quick download of my Facebook activity since September 2010 confirms:
- I use words (189) more than pictures (47), and neglect video (1) almost entirely
- My friends are a bit more female (53%) than male (47%), hail from 24 countries, and include 1 fervent monarchist
- Inexplicably, I post most often at 9pm on a Tuesday night
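The tally behind numbers like these is simple to sketch. Assuming the export has already been parsed down to timestamps and media types (the sample records below are invented stand-ins, not real Facebook archive data), a few lines of Python will surface both the media mix and the habitual posting slot:

```python
from collections import Counter
from datetime import datetime

# Invented sample of an exported activity log; a real Facebook
# download would be parsed out of its HTML/JSON archive instead.
posts = [
    {"when": "2010-09-14 21:05", "type": "status"},
    {"when": "2011-03-22 21:40", "type": "photo"},
    {"when": "2011-03-22 21:55", "type": "status"},
    {"when": "2012-01-10 09:15", "type": "status"},
    {"when": "2012-05-01 21:30", "type": "video"},
]

# Tally media types: words vs. pictures vs. video.
type_counts = Counter(p["type"] for p in posts)

# Tally (weekday, hour) slots to find the habitual posting time.
slot_counts = Counter(
    (datetime.strptime(p["when"], "%Y-%m-%d %H:%M").strftime("%A"),
     datetime.strptime(p["when"], "%Y-%m-%d %H:%M").hour)
    for p in posts
)

print(type_counts.most_common())   # media mix, most-used first
print(slot_counts.most_common(1))  # the habitual slot, e.g. Tuesday at 9pm
```

The same two `Counter` passes generalize to any behavioral log: swap the grouping key and you get posts per country of friend, per month, or per app.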
Aside from the vague shock of realizing where one’s time goes (I recommend RescueTime for a sobering analysis of application and web use), the possibilities for personal analytics are enormous. Nike+ FuelBand is a great example of a personal analytics service that’s addictive and competitive, and effectively connects long-term fitness goals to short-term behavior.
What are the effects of aggregating personal behaviors at this level — not even explicit consumer tastes, just daily habits? We live our lives in public as never before, and what may seem mundane — the precise time we’re gazing into the iPhone’s glowing screen on a Tuesday evening — could lead to useful personal insights, relevant commercial applications, and of course privacy concerns.
Today marks the sesquicentennial of the Battle of Antietam, which, with 23,000 casualties, remains the bloodiest single day in American military history. The American Experience film Death and the Civil War (based on Drew Gilpin Faust’s This Republic of Suffering) focuses on the scale of the death and on the corresponding lack of societal structures to manage its logistics and communications. It seems hard to believe, but before the Civil War there was no national cemetery system, no federally recognized system for identifying the dead, and no means of informing family members. By the end of the war, the federal government had constructed “a new bureaucracy of death.”
A then-emerging technology played a part in people’s perception of death. Mathew Brady’s October 1862 photography exhibition in New York shocked viewers with what was, for many, the first graphic photography of death. While it’s unlikely viewers in New York would have known the subjects, the exhibition brought home an understanding of the loss in a way that both augmented and circumvented newspaper accounts.
The public photography in the Brady show marked a paradigmatic change. Over the next century and a half, the death business was routinized and bureaucratized, with funeral homes, death notices, $25 caskets, and online guest books. In the late 2000s, widespread adoption of social media immeasurably quickened and widened the notification process. As in the other industries it has disrupted, social media “debureaucratizes” death communications in a new and interesting way.
The public nature of the way we broadcast our lives through social networks today necessarily transforms how we communicate death. New technology enables us to share the mundane to an astonishing level, with applications like Instagram transforming the way we experience the mid-day meals of others. Documenting the birth and times of our babies is so ubiquitous that if you want to block those images, there’s an app for that. But there are few apps, and no established social protocols for announcing death through social media. Twitter is rife with death rumors for public figures, but what are the rights and responsibilities of next-of-kin of a regular person, suddenly deceased?
A terse Wikipedia entry on “death and the Internet” tells you the facts: Gmail will pass on your email to next of kin, while Yahoo declines; Facebook will, with proper documentation, allow you to create a memorial for the deceased. Last month an app called If I Die launched, aimed at the pre-dead: it allows people to leave video and text messages in the event of their own sudden demise. There’s a growing need, but both the structures (what happens to email accounts?) and the practices (how do I announce a death on Facebook?) are not yet mature.
One hundred and fifty years after Antietam, the military’s notification teams are skilled in the delivery of bad news and the corresponding support structures — but they now struggle to stay ahead of social networks in informing families. Even without the sudden catalyst of a war killing 2% of the population, social norms around online communication are being forced to adapt for death as they have for life.
Digital projects, like all software endeavors, are easily derailed. Developing a site or application is initially seductive — the discovery phase presents a green field where all frustrations about your existing or missing capabilities can be magically erased by the New Thing. The early vision is grand — the stakeholders are picturing the end result not against a platform or service they have seen, but against a perfect unicorn. Spirits are high; people are engaged.
Requirements are the painful beginning of a process of understanding what’s possible. There’s what’s technically possible, and what’s possible given business owners’ goals, budget, and realistic maintenance capability. Tough compromises are made — in a best-case scenario, rapid prototyping can improve the result. Content strategy may or may not come up; let’s hope it does. It’s a discipline helpful for curtailing impassioned pleas for six-minute welcome videos, and for preventing people without the bandwidth to update a Twitter feed from signing on for weekly 500-word blog posts.
Then a full design phase kicks in, and stakeholders engage fully in imagery, color palettes, and line leading. Hopes are once again high, and Photoshop goes a long way toward erasing the sting of features lost in the requirements phase. The joy of the Bright and Shiny Object is in full effect.
During the build, compromises are made; inevitably, requirements shift to some degree. The technology supports the main use cases, but developers managing cross-platform delivery may have to make hard decisions about the fringes. Even in an eight-week, Agile-influenced sprint, stakeholders are exhausted.
Enter the final 10%. The final 10% is what separates a just-OK user experience from a terrific one. It’s closely related to the effort Ben Lerer pointed to in the NYT yesterday. The final 10% means taking care of the tedious details that ensure your project delivers meaningful search results, provides analytics to inform future iterations (and not just fill inboxes), plays well with social media, and syndicates content neatly where it’s supposed to go.
The final 10% isn’t sexy — it’s stuff like delivering small fixes to the administrative interface that will cumulatively make the difference between adoption and rejection, or checking that the adaptive design breaks just right in the 84,563 flavors of Android. The final 10% isn’t capital-V Vision like the discovery phase or beauty like the design phase, but it’s a big predictor of digital project success.
Don’t do something 90 percent well and hope that it’ll slide through. Don’t rely on luck. You have to make your own luck. The only thing you can do is try your absolute best to do the right thing. And then if it doesn’t work out, you know there’s nothing else you can do.
-New York Times interview with Thrillist’s Ben Lerer