“Legioni di imbecilli”

“Legioni di imbecilli.” That’s how Italian author Umberto Eco recently described social media users and their opinions. The man has style.

Keep the phrase in mind, particularly on social media where, in Eco’s estimation, the Internet has “promoted the village idiot to the bearer of truth.” Although Eco’s observation may seem arrogant and condescending, I think that the idea animating it is rather humble.

“Know-it-all” is a common stance. Being oriented toward uncertainty and discounting claims about the unknown are rarer qualities. In the opening to The Black Swan, Nassim Taleb describes Umberto Eco’s book collection as an “anti-library.” The most valuable titles are those yet unread. They contain depths not yet fathomed.

The same could be said about the Internet when information is approached with the skeptical mind of a scientist. Everything must be tested, or as physicist Richard Feynman said, “science is the belief in the ignorance of experts.”

Eco said that idiots making drunken comments in wine bars don’t do too much damage to the community because they can be “quickly silenced.” The Internet has taken away this natural restraint.

The machine never died

After a long journey with my friend Runzour, this Polar RS200 has been reborn to serve a new master.

Today I spent $27 on batteries to relaunch a circa-2007 “running computer” given to me by my friend Runzour. While I was deleting his data, I noticed that he had logged more than 4,000 kilometers with it. The chest strap is weathered from use. This is ancient by today’s running-watch standards, but that gives it some character. Many physical objects don’t acquire much character during their first decade of use. The torch is passed. The machine never died. It was just sleeping, temporarily out of juice.

For someone who does as much running as I do, I’m a bit of an anomaly in that I’ve never used a running watch, so this is my first foray into the technology. One upside is that all the data the watch produces is intrinsic to the machine itself, since it has no GPS connection.

Ubiquitous Wireless Connectivity Should be a Right

Imagine a world where every American could tear up their internet contract. All those contracts are great for Verizon, Comcast, and the like, but I think they are causing a big economic drag — for individual pocketbooks and for broad-based economic development alike. I hope within my lifetime to see the moment when business and the market force a positive change.

Now that SpaceX is planning to blanket the US in better satellite coverage, things may start to get more interesting. The internet of things, in which sensors everywhere fix problems and make suggestions, requires ubiquitous connectivity.

Business councils could argue that if WiFi were considered more of a right than a service, a flood of new economic activity and opportunities would result. And not only businesses would prosper. Poor people, who often rely on libraries because they cannot afford internet contracts, could more easily seek new opportunities and create their own enterprises.

Until this happens, the providers are lording it over subscribers. My neighbor, who uses an antenna to get free TV and just needs internet, told me his provider informed him his rate would rise from $50 to almost $100 when his two-year contract elapsed. The only way to bring the rate back down was to agree to a promotional deal that included cable TV.

The providers are not plain dealers. I believe they have engineered a useful system, but they are stuck in short-term revenue-maximization mode. The way they behave makes me think of them as vampires. Hopefully the market undercuts them in a way that promotes human lives and greater prosperity.

Bye Bye Twitter

By Daniel Wilcock


I quit Twitter this month for the following reasons:


  1. Using Twitter is playing with fire when it comes to your life and your reputation. You can lose your job or suffer other similar losses for momentary stupidity or carelessness—or even just for writing something that can be misinterpreted. There’s so much questionable content on Twitter that it’s also easy to embroil yourself by accidentally retweeting it. In the analog world, statements can usually be clarified in the moment. “Did you mean what I think you just said?” That doesn’t always happen, but it’s more likely than on Twitter. Our statements also tend to expire with the air that carries them. This freedom may start to disappear with the possibility of devices like Google Glass recording all speech. Europeans won the right for their online lives to be forgotten by Google. One day we may cloak our speech in scrambling technology, which we turn off only when we give explicit permission to be recorded. Until then, I’m going to confine my online expression to carefully considered media such as this blog. There wasn’t any specific incident that pushed me over the edge. I just sense the potential for one in the future.


  2. I’m not a politician, and I don’t have products to sell, so why am I constantly marketing myself? You may be different from me. You may have a job that requires you to market certain wares, including yourself. I feel fortunate that I’m not in that position except for when I’m job hunting, which I’m not right now. So much of Twitter is pure marketing. Why subject myself to it? Why not choose the ability to discern and think for myself? Why not seek out the voices I find wise, rather than what I’m exposed to by marketers? I’ve used Twitter to market this blog, but I realized that I’m mostly writing this blog to record my own thoughts. If people tune in, cool, but I’ve stopped marketing it.


  3. Twitter is just about as bad as Facebook. Twitter and Facebook are seemingly very different. I used to think Twitter was better because it tends to shoot you into other websites suggested by the people you follow rather than try to hold you in place like Facebook does. It also makes it easier to interact directly with just about anyone, unlike Facebook, where public figures are a few steps removed. You can “like” celebrities on Facebook and leave comments for them, but it’s harder to reach them directly the way a tweet can. Despite the differences, the two companies are in the very same business: selling you to corporations. When I quit Facebook more than a year ago, Twitter became its replacement for me, to the tune of 1,000 tweets. I think I always knew that it was kind of a false choice—like thinking Coke is any better than Pepsi. Both are corrosive to health, at least for me.


  4. It changes people. I’m a fan of Nassim Nicholas Taleb’s writings, but I can hardly stand his persona on Twitter where he gets into quarrels, spouts off rather randomly (pun only somewhat intended), and generally goes against his stated desire to be a private person and not a public intellectual. I think he’d be better off quitting too, especially since he writes in his books about via negativa, removing things that undermine our ability to live well and trusting negative rather than positive advice.


  5. It involves spending time writing to strangers rather than my family and friends. Think about it. I do realize that you can only allow your family and friends to follow you on Twitter. Maybe that’s a good option. Luckily, most of my family and friends are too sensible to be on Twitter. I’m certain I’m missing many family photo albums on Facebook. That is lamentable, but not worth signing back up for an account. Perhaps it will give me an excuse to start calling family members. “Hey, I haven’t seen you guys in forever, especially since I’m not on social media. Could we visit sometime?”


  6. Community is centered on my doorstep, not in cyberspace. I think I’ll let this one stand without further comment.


  7. It’s just not for me. If you use Twitter daily and love it, that’s your right. I’m judging it, but by my own lights. If you love it and see no reason to stop, tweet on, my friend.


  8. Last but not least, the way reciprocity works on Twitter bugs me. Follow me and I’ll follow you back. Coded hashtags designed to signal readiness to play this game of stranger accumulation. Drop these same strangers, and they’ll drop you. And then there are a few people I know who didn’t follow me back. Is this because they don’t reciprocate, or because they simply don’t notice in the avalanche of their Twitter feeds? These kinds of questions could drive me crazy with a kind of internet-based palace intrigue. Forget about it! Life is complex enough without needless head games.

Review: The Circle by Dave Eggers

Image from the publisher’s website

By Daniel Wilcock

Google, Facebook, Twitter, Amazon. What if they all got rolled up into one corporation? What if that corporation became increasingly omniscient and swallowed the political world and then the totality of everyone’s quotidian life? Past dystopian novels such as Fahrenheit 451 (which targeted television and the banning/burning of books) plotted the trajectory of an illiterate society ruled by mind-control. The Circle may be as implausible as Fahrenheit 451 in the long run, but it raises a lot of key questions about where we’re headed as a society with the increasing ubiquity of information technology. I can’t think of a better novel for raising these questions through a compelling work of art.

The novel, set in the not-too-distant future, opens with Mae, the main character, beginning a job at the Circle. Openings at the tech firm (which resembles Google) are hard to get and highly coveted. Mae has an inside connection in Annie, her friend from college who has risen to great heights within the company. The new gig rescues her from the dreary job she’s held down at a utility company in the months since graduating. These opening scenes are what I’d imagine the first few days of a new job at Google to be like, only even more cartoonish.

That being said, Mae’s initial job is real work, handling the complaints of companies that advertise and sell products using the Circle and its currency system. Customer satisfaction must hover near 100 percent and reciprocating messages and invitations from fellow workers (inner circle) and followers (outer circle) is expected. Screens on her desk proliferate. The Circle keeps adding digital treadmills under her feet, but she’s remarkably adept—Annie tells her she’ll rise fast, and this sets her up as a kind of ‘chosen one’ figure.

Life outside the circle is painful, complicated, and slow. Mae’s parents struggle with her father’s MS and with battling their insurance company. But Mae still finds some enjoyment in the outside world, kayaking in the San Francisco Bay. These naturalistic interludes stop when Mae gets caught borrowing a kayak from the rental shop after hours by one of the exponentially proliferating “SeeChange” cameras that feed HD video into the Circle. As penance, Mae decides to “go clear” (an echo of Scientology) by donning a camera that broadcasts her every move to her growing list of online followers. An increasing proportion of the world’s politicians have gone clear, and the Circle is poised to ensure that everything is known.

What happens to Mae? There are key characters and events I’m leaving out. I don’t want to spoil the book, which really is worth the time. For me, the pages flew by in just a couple of days. I guess you could say I was already pretty receptive to the points that Eggers is making through his fictional craft. Last year I quit Facebook. At the beginning of this year, I decided to stop shopping at Amazon. Recently, I dropped Twitter. My problem with each of them is their tendency to draw humans into their own little marketing-oriented universes. Twitter seemed a bit more useful, as it so easily spits users into other web pages. But it still tracks you for profit and mostly is just a marketing echo chamber. I still use Google, and perhaps this is the company that would be the hardest to avoid, since its free services are so ubiquitous (Gmail, Maps, Drive, etc., etc., etc.) and its mastery of the online advertising market is almost complete (“complete” is an important word in The Circle).

Google has a very wide utopian streak. But as Jaron Lanier points out in Who Owns the Future?, the utopian vision has also led to consolidation of money and power. I’d recommend pairing that book with The Circle. In different ways, both authors are calling for underground resistance and disruption of the mega disruptors. Recently I’ve read a lot of Nassim Nicholas Taleb, whose writings—particularly Antifragile—argue against size and speed in favor of things that are decentralized, idiosyncratic, human, ecological. I agree with this line of thinking. I could also see how another reader might come to a very different place with The Circle, which is a testament to the book’s understatement. I recommend this book highly.

Blogging in WordPress with MS Word: Any Good?

By Dan Wilcock

I’m posting this entry on my WordPress.com blog using MS Word 2010 as an experiment, just to see whether it’s any good. Word is set up to interface with most of the major free blog hosting companies, as seen below:

[Screenshot: Word’s list of supported blog providers]
So if you’ve got a WordPress blog, like me, or one of the other standard options, the interface should be pretty smooth.

Once you enter your blog’s URL into the path field and supply your blog password, the wizard will let you know whether your blog has been successfully registered. At first this didn’t work for me, but it went through once I realized that the /xmlrpc.php extension after the blog URL is necessary.
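For the curious, the registration step can be sketched in a few lines. Blog clients like Word talk to WordPress through its XML-RPC interface, which lives at that /xmlrpc.php endpoint (the metaWeblog API). The URL and credentials below are hypothetical placeholders, and the actual publish call is shown but commented out, since it would need a live blog:

```python
# Sketch of what a blog client does when it registers and posts to a blog:
# it speaks XML-RPC to WordPress's /xmlrpc.php endpoint.
# The URL and credentials here are hypothetical placeholders.
import xmlrpc.client

blog_url = "https://example.wordpress.com"
endpoint = blog_url + "/xmlrpc.php"  # without this extension, registration fails

client = xmlrpc.client.ServerProxy(endpoint)

post = {
    "title": "Hello from XML-RPC",
    "description": "<p>Body of the post, as HTML.</p>",
}
# Publish the post (final True = publish immediately). Commented out so the
# sketch runs without a live blog or real credentials:
# post_id = client.metaWeblog.newPost("", "username", "password", post, True)
```

This is just a sketch of the protocol, not how Word is actually implemented, but it shows why the endpoint path matters: the wizard has no way to find the XML-RPC listener without it.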

So I’m in and composing this post—So far, so good. Here are five things I appreciate right off the bat:

  1. Better control of special characters. I can use my preprogrammed keystrokes to type special characters while blogging. The em dash (—) I just typed would take two clicks of the mouse to execute in WordPress, but here I just hit F10 (the keystroke I chose for em dashes, which I use frequently in my writing, perhaps too much). Word is probably better at special characters and symbols than WordPress. For example, look at what WordPress did to Pema Chödrön’s name on my ‘about me’ page (look under contemporary thinkers). With Word, this comes out a lot smoother. (Update: this bit was simply my error. I used the upper-case letter, not bothering to check that an appropriate lower-case letter is available.)


  2. More upfront control over font size. I’m sure I can code font size by going under the hood and coding it in WordPress’s text viewer, but this is much more civilized.


  3. That screen clipping above is nice/was easy. I copied it by hitting window key + PrtScn, and then cropped and resized it right within the Word document. I have no idea how I’d do that with WordPress.


  4. This list I’m creating does something I’ve been unable to do with WordPress. Namely, I’m able to add an extra space between items, something I think just looks better. Word does this automatically after you hit return between two items on a list and then delete the middle item. It’s a nifty bit of intuitive functionality that I really miss when I’m composing in WordPress.


  5. All those snazzy SmartArt templates, Word’s spell check, etc. Most of the basic bells and whistles are built into Word, such as automatic tables. I’m sure a halfway decent WordPress wizard could rebut this post point by point, showing that composing in Word is unnecessary, but so far this seems to me like a decent way to go.


I don’t have any negatives yet, but will append a list once I’ve pressed the green publish arrow that Word puts at the upper left-hand corner of the screen. If the blog spits out a bunch of ampersands afterward, I’ll write my complaints below.


Nope. Now I’m editing the post in WordPress.com, and everything came through perfectly. I think I’ve found a better way for me to blog.

UPDATE (6/23/2014): Word does have a blind spot when it comes to blogging in WordPress: video. There is no button to insert video content, unlike WordPress, which has the handy “Add media” button with easy ‘insert YouTube’ options.

Micro Rentier

By Dan Wilcock

Merriam-Webster defines the French word rentier as “a person who lives on income from property or securities.” Here in America, rentier is an uncommon word. Word’s spell-check underlines it.

But it may become a lot more common thanks to Capital in the Twenty-First Century, the bestselling 700-page overview of inequality in rich countries by Thomas Piketty. I just started reading it, and found myself looking up the word in the dictionary. He uses rentier frequently to describe people whose income derives from capital assets instead of direct payment for labor.

Even though Piketty’s book brilliantly uses data to show that the rentiers are grabbing an increasingly large percentage of total wealth, I can’t help but notice that the lines between rentiers and laborers are increasingly blurring in the rich countries he studies.

A great example of this blurring is the so-called “sharing economy,” which was the focus of a cover story in Wired magazine this month subtitled “how Airbnb, Lyft, and Tinder are teaching us to love strangers.”

Is the owner of a late-model car who moonlights on nights and weekends by driving sloshy revelers around town for Uber a worker or a rentier? A bit of both, I think.

The Wired article opens with a day-in-the-life narrative of a 30-year-old woman who works as a freelance yoga instructor and personal trainer when not picking up rides through Lyft. It doesn’t sound like dynastic wealth to me, but nor does it sound much like 9-to-5 work for a paycheck.

I think the web is turning folks into micro rentiers. Traditional income streams, measured in per capita income and unemployment stats, may have flat-lined in recent years, which is a big factor in the rise of inequality. But the web is making it easier to unlock the asset potential of privately-owned stuff like cars and houses that until now were mostly financial liabilities.

I personally haven’t become a micro rentier yet. Security, privacy, and serenity are all things for which I’m willing to pay the opportunity cost of foregone income. But if I were squeezed, I’d definitely consider sticking my toe in these waters. Our economy may well squeeze most of us into these markets in the next few years, at least those without the keys to dynastic fortunes.

Hoya Saxa: congrats to Georgetown’s graduating technology management students

Spring 2014 Technology Management Capstone Class, Georgetown University

Pictured above: This spring’s capstone class. Everyone’s smiling, having just completed Georgetown University’s Technology Management program. (I’m in the blue jacket on the left.)

Congratulations to my classmates. I wish them all the best.

So was it all worth it? Yes, after three years of study I can say that I’m wiser in the ways of technology. I’m not quitting my work as a writer and editor any time soon, but the digital revolution now feels like something I can harness where useful, while ignoring the media-driven noise surrounding all things cyber. Before I took this program, my relationship with technology was based more on ignorance and fear. Now, thanks to master teachers like John Gilroy and Pablo Molina (who helped found the program), I can approach technological solutions with far more confidence.

Thanks go to Georgetown University, which paid 70% of my tuition as an employee benefit over the years (I started working for Children’s National in 2012, and since then I’ve finished the program slowly). One of the great aspects of working for Georgetown is that, after one year, they cover tuition entirely. If the study is related to the job, the tuition benefit is tax exempt. Anyone interested in Georgetown’s professional studies programs, of which Technology Management is one, would be very wise to consider applying to work at Georgetown one year out to take advantage of this incredible benefit.

Not every class was wonderful, but the classes were filled with bright technologists and aspiring technologists (like me) from a broad diversity of backgrounds. Quite a few students hailed from Africa, where technology has the power to change things dramatically. A couple of former students were advanced enough in their careers that they came back to teach in the program.

The program’s biggest weakness is over-reliance on a business-school paradigm of hypothetical business cases and pitches. It would be better to really build things, launch them (even if it’s just a prototype, no elaborate business plan), and shop them around town within DC’s growing tech entrepreneurship scene. Maybe it’s too much to ask for the school to administer that. Student initiative needs to count for something. Entrepreneurship isn’t cookie-cutter. That being said, I think the program could be a bit stronger with a more robust framework tied to real opportunities: “Oh, that slide deck is hypothetical, you say? We’d be glad to take it off your hands and run with it.” Such words would embolden some graduates to leave the safety nets of their jobs to join the start-up fray, or to start side projects without a care for whether VC money ever gets involved.

For me, studying technology has made me realize the value of my current work, which is far outside of the realm of IT. Working with words all day is a pleasant way to earn a living, and I feel most fortunate. Thanks to the TM program, if I need to expand on that work on new platforms, through digital videos, etc., I’m ready.

So mission accomplished. Cheers! Mazel Tov! Kampai! etc.


Ignore the noise, but not the real value of STEM

I think we can all agree that perpetual disruption and reinvention can be tiresome, especially when the unspoken subtext is that a lot of folks are going to get fired. A lot of Thomas Friedman columns these days just get booed down. We’ve heard that record before.

Lingering over a cup of loose-leaf-brewed tea, walking or running in the woods, talking in person with family and friends, and reading printed books all bring joy to life. These fulfilling activities were as available to our distant ancestors (with the possible exception of books) as they are to us today.

We may in the not-too-distant future transition to spending much of our time working and playing inside virtual-reality. The internet has already brought us halfway there. Facebook is currently spending tens of billions of dollars (!) buying up companies that will complete the trick.

But so much of the innovation that gets touted as disruptive is tacky. Facebook and (alas) Twitter are businesses in which you, the consumer, are the product being sold to corporations that want to sell you stuff you probably don’t need. 3D printing at home is another example. Sure, we can all use CAD files and replicating machines to bring manufacturing in-house and fulfill our wants with ever greater precision. But at the end of the day, it’s mostly just a bunch of customized plastic. (I’d like to know how well the recycling system will work to ensure that what we print can become the raw material of our next creation. If it’s done right, and it becomes more ecological than factory production, I could see myself changing my mind about 3D printing.)

I’m not advocating for a return to analog-everything. Despite the fact that the best things in life may be simple things, I think science, technology, engineering, and mathematics (STEM) have made us much better off than those folks of yesteryear who only had simplicity and their wits.

Better application of science and technology for health and prosperity can be far more profound than the superficial offerings of social media, Google Glass and 3D printing. For me, the part of the STEM revolution that brings value is the power it gives us to better enjoy the simple pleasures of life.

What could be more basic and life-affirming than enjoying good health? Because we’re alive today, we can read medical science books informed by more comprehensive data and better technology. Here are two examples:

I first encountered David Agus, MD, in the pages of Wired magazine. That feature article opened my eyes to a different, more systematic, way of understanding biology, which he outlines in his book The End of Illness. His scientific/systems biology approach allows him to distill some simple recommendations about health that cut through the media hype. Some of these are somewhat surprising, such as his advocacy of cutting vitamins and supplements in favor of eating real food.

Robert Lustig, MD, is famous for “Sugar: the bitter truth,” which has been viewed almost 4.5 million times on YouTube, where I was first exposed to him. Like Agus, Lustig takes a systems approach that makes a compelling case against added sugar and industrial food processing. His book Fat Chance makes that case convincingly and scientifically.

Then there’s our financial health. The simple life is aided by avoiding all the complexities of poverty, and in this regard people-oriented financial technology is making it easier for the disciplined “little guy” to achieve some degree of financial independence (FI). Index investing is a product of the IT revolution, and its champions—people like Vanguard founder Jack Bogle, the financial academic Burton Malkiel, and the increasingly popular blogger Mr. Money Mustache (all three are worth reading)—point out how computing power has made investing simpler and less costly.

Technology allows us to track our health and wealth with greater accuracy than ever before. To the degree that next generation body sensors (like Fitbit) and financial aggregators (such as Personal Capital) aren’t distracting or even all-consuming, these are great reminders that until we die we always have room to grow and improve.

My point is that STEM has the power to transform complexity into life-affirming simplicity, and its power to do so is getting better each day. Underneath all the noise, these are the real life improvements.

Review: The Information Diet

By Dan Wilcock

Here’s a book that most of us could use. Clay Johnson’s book The Information Diet (2012, O’Reilly Media) takes a non-pretentious look at what would be painfully obvious were we not so engrossed: we’ve become information obese. Our brains are constantly fattened for the kill. We need to be a lot more choosy about the quantity and quality of what we put into our heads.

Johnson is an IT guy. He founded Blue State Digital, the digital strategy company that helped Obama first win the presidency in 2008, and has since gone on to found a variety of ventures at the intersection of policy and software development. He writes with a smart, non-pedantic style. Yet he isn’t immune to hyperbole:

The Internet is the single biggest creator of ignorance mankind has ever created, as well as the single biggest eliminator of that ignorance.

Perhaps he’s right about this. It depends on how you define “biggest,” but it’s a bold claim nonetheless. That being said, boldness is a strong suit to have if you’re going to take on the media matrix in which our minds swim. Rage Against the Machine once critiqued this matrix as follows:

No escape from the mass mind rape
Play it again Jack and then rewind the tape
Play it again and again and again
Until ya mind is locked in

Believin’ all the lies that they are tellin’ ya
Buyin’ all the products that they are sellin’ ya
They say jump, ya say how high
Ya brain dead
Ya gotta fuckin’ bullet in your head (source: MetroLyrics)

Johnson basically concurs with Zack De La Rocha, but without the stridency:

Our attention is the currency that marketers lust for, and it’s about time we started guarding it, consciously, like we guard our bank accounts.

Being an IT guy, Johnson favors a programming approach. Perhaps for most people already addicted to social media and their favorite pundits it’s more like re-programming. He suggests that people move beyond “a reactive model of computing, where you’re constantly being tugged and pulled in every direction and responding to every notification that comes across your screen, into a conscious model, where you’re in complete control of what you’re paying attention to.”

Music to my ears. The human can manage information, not the other way around. So simple it’s kind of dumb, but what’s really dumb is that far too many of us are not awake to this reality. Johnson argues that we have four powerful mental muscles we can flex: searching, filtering, creating, and synthesizing.

Johnson’s framework and the how-to advice that flows from it are the book’s best aspects. Less developed is the larger societal context in which he tries to fit these skills in the book’s final pages. After leaving Blue State, Johnson directed Sunlight Labs, part of the pro-government-transparency Sunlight Foundation, and I think he’s still too close to that foundation’s mission to be objective about it. That mission may be what drove him to write the book in the first place (he often recommends going to original sources such as government databases), but for me the book’s general lessons are more valuable, and more universally applicable, than the limited issue of whether government data gets posted.

All in all though, a very worthwhile book by a socially conscious technologist. More like this, please.