Corporate Guide To Social Media And Complacent Hashtivism

Here’s the first entry under the Writing category. It’s also a “working example” of the kind of post I’ll be putting under this category. 

In this case, I’m starting out with a description of something. Here, it’s social media and one way that it’s used. Because this is filed under the “Cyberpunk” subcategory, the second half is about how this modern system is or could be used in the future, written almost in a “handbook of social media dos and don’ts from highly effective corporations” style.

Again, this isn’t an endorsement or a condemnation of practices, but rather an interpretation of how — in this case — social media might evolve, from the perspective of a cyberpunk universe.

Social Media Now

One of the supposed benefits of social media is that it allows people to address entities larger than themselves, but also to aggregate their telepresence for a cause. Social media has given individuals the opportunity to broadcast their voices into the void, and to tie their posts together via links and hashtags so that what would otherwise be a single unheard voice merges with thousands or millions of others in a show of force that is difficult to ignore.

Despite the various troubles that social media has (privacy concerns, pop-culture irrelevance, harassment, etc.), it is a democratic platform in both design and purpose.

Social Media Later

The downside to this kind of hashtivism is that it’s only one step above doing nothing at all.

The history of organized movements dead-ended the day that the hashtag was born. Before our ability to reach everywhere on the back of a pound-symbol, human beings had to gather in public in order to make a significant statement. This ranged anywhere from peaceful sit-ins to unruly mobs, but getting on board for something required an effort, which meant that those who actually showed up were really serious about the situation.

Today, it takes almost zero effort to re-Tweet or to sign an online petition. It’s a throw-away action that can be done in less time than a commercial break. Anyone can claim to be on board with a movement when all it takes is no effort at all, but the best part is that everyone can see you doing it, thanks to the follows and re-Tweets that reach around the globe.

This is beneficial to concerned parties because a pile of Tweets does not an angry mob make, no matter how much sarcasm or profanity is involved. For a savvy organization, be it a corporation or a government, proper management of hashtivism can make consumers and voters feel that they’re involved in a process by encouraging them to put as little effort into that process as possible.

There’s no guarantee that any hashtag or petition actually reaches anyone in a position to do anything about it, nor any reassurance that, if the aggregated ire is seen by those who can act on it, any action will be taken. Managing the belief that “someone is listening” can be of great benefit to an organization or government, because the mere perception that good is being done with a tap is enough to give people a “warm glow” that they’re helping without having to actually get involved with money or time.

The best way to do this is to ensure that the outcome of the campaign the hashtivists are rallying behind is pre-determined, to provide a “honey-pot” campaign that ties up the efforts of organizers, or to simply inundate social media with so many false positives that their involvement becomes muddied and meaningless. In any case, the participants will either feel that they accomplished their goals, or they will simply “fire and forget their ire” as they return to their binge viewing and meme generation.

New! Scratch Pad Blog

Same blog; different purpose (kinda).

I don’t usually write stuff here. This is my “non-gaming blog”, which is supposed to be for other…non-gaming-related topics. It’s not that I rarely have non-gaming thoughts; I just don’t normally want to write about them. Some topics I think about I simply don’t want to discuss, which means that this space is usually, and intentionally, left blank.

So with Nanowrimo coming back around, and since it’s something I have always wanted to do, I figured that I should keep some notes on stuff to write about…somewhere. Having a repository I can access from almost anywhere is difficult because while there are web-based tools like Google Docs, I don’t like their linear flow. The blog format is actually a good one for me, because I can format and organize thoughts by category and tag for future reference.

So you’ll hopefully see more posts coming up through here. I’ve created a category called “Writing” which is dedicated to writing of all kinds, whether it’s for Nanowrimo or other projects. Subcategories will be assigned to keep it organized by genre, although not necessarily by specific project.

Now, these kinds of posts may seem like they’re being posted as regular blog posts. In that light, some of them may be viewed as “controversial”, or as though I’m espousing a specific position or agenda. If a post shows up under the Writing category, it’s a brain dump for a particular genre or project. I don’t want this to turn into a magnet for argument, nor do I want people to think that I’m represented by anything under the Writing category or any of its subcategories. People write racist characters without being racist. People write about religion without being religious. This category is for stuff that I might want to set down to use later, and that’s all it is. Feel free to comment, because some ideas may benefit from a massaging, but anyone who decides to “set me straight” on a fictional note-card is going to get banned hard, and then shamed on social networking.

Math Is Why America Is So Fat

So I went to my annual physical (which I hadn’t had since 2011) and was told that I needed to lose weight if I didn’t want to die of a heart attack at some point in my life (my take-away message, anyway). I mean, anyone can die of a heart attack, so the best we can do is manage our chances to the best of our ability. One way to do that is to manage weight. This involves “exercise and eating right”, which is what every single health and fitness guru and commercial tells you.

What the hell does “exercise and eating right” mean, exactly? Exercise is pretty simple: get off your fat ass. I bought a FitBit, which has been serving as a totem to remind me to get up and move (I sit at a desk all day). I have been walking the dog in the morning before work, getting outside to do walking laps at work twice per day, and have been using the dust collecto…elliptical machine…that we have at home. FitBit wants me to hit 10,000 steps per day, so FitBit can go fuck itself because even with the regimen listed above (which I can achieve because I’m at work and need to get the hell outside), I’m not going to hit 10,000 — close, but no granola bar. Still, the 9000+ steps I do manage is about 8975 more steps than I was taking previously, so I’m pretty focused on the exercise part.

The eating part, though…that’s a lot tougher, but not entirely for the reasons you’d think. One thing my doctor said that resonated with me was “portion control”. Here in the U.S., we’re blasted for eating unhealthy foods, but what really gets us, I think, is that we’re given so damn much food. We like to eat because it tastes good, and because we’re on the tail end of the generation that was raised to clear its plate before leaving the dinner table. That’s a pretty bad combo right there, because we end up eating way more than we really should. We’re given more, it tastes good, and we don’t want to feel that we’re wasting food. If you’re a parent, it’s worse when you eat whatever your kids don’t manage to finish.

Still, our bodies require a certain amount of energy in the form of calories, based on our age, height, current weight, and other voodoo. If we don’t satisfy this need, we go into ketosis, which is a scientific term for tapping the fat reserves, although you’ll hear fitness-terrorists refer to it as “starvation mode” because technically that’s what your body does — naturally — when you aren’t meeting your caloric needs.

So we’re supposed to burn more calories than we take in, and that’s “losing weight”. There are a lot of other aspects of “eating healthy”, like fat and sodium and protein, but as far as losing weight goes, it makes sense that for all the calories we take in, we need to expunge an equal or greater amount through exercise.
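If you strip away the voodoo, the arithmetic is at least simple on paper. Here’s a minimal sketch of the calories-in/calories-out model, using the numbers from my own plan (which come up below) and the common, admittedly rough, rule of thumb that a pound of fat works out to around 3,500 calories; this is just the arithmetic, not medical advice.

# A minimal sketch of the calories-in / calories-out model.
# Assumes the common (rough!) rule of thumb of ~3,500 kcal per pound of fat.

KCAL_PER_POUND = 3500  # rule-of-thumb energy content of one pound of fat

def daily_deficit(burned: float, eaten: float) -> float:
    """Net calories for the day; positive means you're losing weight."""
    return burned - eaten

def days_to_lose(pounds: float, avg_deficit: float) -> float:
    """Rough number of days to lose `pounds` at a steady daily deficit."""
    return (pounds * KCAL_PER_POUND) / avg_deficit

# The numbers from my plan: ~2700 kcal/day maintenance, 2225 kcal/day target.
deficit = daily_deficit(burned=2700, eaten=2225)
print(f"Daily deficit: {deficit:.0f} kcal")                      # 475 kcal
print(f"Days to lose 14 lbs: {days_to_lose(14, deficit):.0f}")   # ~103 days

Simple enough, right? Keep that in mind as we watch it fall apart below.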

Which leads me to this:

[Image: WTH_Fitness (FitBit stats dashboard)]

See, these are my stats from the FitBit site, circa 2:30 PM today. So far, I’ve had breakfast, lunch, a snack, a coffee, and a lot of water. I’ve also walked the dog and completed two circuits around the office park. I’ve burned 1920 calories, and have taken in 1075 calories. Technically, I’m doing well per the logical assessment of how we’re supposed to lose weight, right?

No. Because I’ve got this nagging at me (from MyFitnessPal.com):

[Image: WTH_ExtraCalories (MyFitnessPal calorie adjustment)]

Look at that asterisk. I’ve “earned” an extra 490 calories. Out of the aether, I have been granted some kind of cosmic dispensation to take in another 490 calories.

Hold up: I need 2700 calories per day according to the Mayo Clinic. The meal plan that FitBit has me on wants me to take in about 2225 calories per day to reach my weight-loss goal of 14 pounds (1 stone for my international readers). I’ve burned 1920 calories so far today, and have eaten 1075. By my math — fucking math — I need to eat another 1150 calories to reach that 2225.

Is that right? I have no fucking clue. FitBit gives me a vague bunch of numbers and graphs in fancy “Web 2.0” fashion. Another dashboard tells me I can eat another 1150 calories. Add that to what I’ve eaten today and you get 2225, which is on target for my plan. But if I eat that 1150, my total day’s caloric intake exceeds my current caloric burn of 1920. Making matters worse (in my mind) is that this 1150 is a shifting goal based on my daily need and what I burn. If I hit the elliptical when I get home, I’ll burn more calories, and the amount I’m allowed to eat will increase.
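As best I can tell (and this is my guess at the dashboard arithmetic, not anything FitBit or MyFitnessPal spells out up front), the trackers measure you against the plan’s intake goal rather than your raw burn, and they credit logged exercise back to you so the planned deficit stays constant. A sketch of that guess:

# My guess at the dashboard arithmetic; not an official FitBit or
# MyFitnessPal formula, just the one that reproduces the numbers above.

plan_goal = 2225   # FitBit's daily intake target for my plan
eaten     = 1075   # calories logged so far today

# "Can still eat", measured against the plan alone:
remaining = plan_goal - eaten
print(remaining)   # 1150, matching the dashboard

# MyFitnessPal-style exercise credit: logged activity raises the goal,
# so the planned deficit stays the same even as the target moves.
exercise_bonus = 490
print(plan_goal + exercise_bonus - eaten)   # 1640, if the bonus applies

If that reading is right, the moving target isn’t the system cheating me; it’s the system holding my deficit constant while my burn changes, which at least explains why more exercise means I’m told to eat more.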

If I’m understanding this correctly, and omitting considerations surrounding other elements (fat, sodium, et al.), I’ll need to actually eat more the more exercise I do. Where in the manual is this actually explained? Nowhere, that’s where. It’s totally counter-intuitive. The platitude of “eat well and exercise” is about as reductionist as this subject can get, and considering the amount of math and sliding scales that exist under the covers, I can totally understand why people start down the fitness trail and get quickly derailed. It’s also why nutritionists and fitness coaches have to have degrees and certifications.

The best I can do at this point is rely totally on these charts and graphs, logging every food and recording every exercise. As I go, I’ll see what I’m taking in and what I’m getting rid of, and hopefully get a better feel for the rhythm of weight loss. But right now, the numbers and their esoteric calculations are throwing me off. Fucking math.

The Shame Of Losing And The Cult Of Winning

Here in the West, specifically in the U.S., we value winning over pretty much everything. In any contest — sports, academics, the military, even social situations — the trajectory of progress is linear: keep your eyes on the goal, full steam ahead, and don’t let anything get in your way.

That’s what competition is about, after all. Why play if you’re not out to win? Why would you pay money to see a movie if you just plan on falling asleep? Winning at something isn’t really at issue here. Winning, coming out ahead, achieving first place…all inherently noble goals that under perfect conditions push us to do our very best and, failing that, make us want to learn more, train harder, and try again.

Trying again isn’t always an option, though, and that’s the problem. Our culture is so winning-oriented that we have effectively removed all benefit from failure. It’s become a dirty word, and a mark of shame. “You failed.” “You are a failure.” It’s one of the worst stigmas a person can live with in modern Western society.

On one hand, we lionize winning. Our culture is steeped in messages that winning is everything: “win big or go home”; “second place is first loser”. All sporting equipment is sold with the promise that it’ll catapult you into the winner’s circle. Watch any championship broadcast and you’ll see orchestrated images of happy winners and dejected losers. Even in the niche realm of PC components aimed at video game enthusiasts, you’ll see ads from manufacturers extolling how their products will let you “dominate” and “destroy your competition”.

Failure, then, is no longer defined as the position earned when the other guy did better than you. It’s now viewed as not having measured up, as not being good enough. Losers are shamed in this environment; it’s not even enough simply to win. The accolades a winner receives are directly related to how brutally they bury their opponent. The goal isn’t just to compete, but to massacre the competition to the point where they can’t even rise again to demand a rematch.

It’d be one thing if we were just talking about sports here. After all, we’re a species that figured putting guys with swords in an arena qualified as a “sport”, so in the Big Picture, creative camera work that highlights the happy winners and weeping losers is pretty benign. But here in the West, where winning means everything, it manages to infiltrate all kinds of places where there shouldn’t be any competition, and where there normally is, it elevates that competition to the level of a bloodsport.

The biggest ramification that I see is that it drives people apart. Everything becomes about winning, and about being right. It means that we can’t have discussions on important topics, because each of us has closely held beliefs that we need to defend at all costs. Any point of view that could alter our personal world view isn’t seen as an opportunity to expand that world view, but as proof that we lost an argument and were wrong.

Being wrong is just as bad as losing in modern society, and the only way we can “be wrong” is if someone else is “right”, and only if both parties (if not more) are aware of it. That results in a social showdown in which one person gets to do a victory dance while the other looks foolish. On the Internet, this is magnified exponentially, and it never, ever goes away. Our loss becomes institutionalized in Google’s page cache, on Facebook, or on some other social network. So people do everything they can to minimize their chances of looking foolish and being branded a loser: they avoid discussion entirely, or, if they are pulled into it (willfully or not), the fangs come out and it’s a take-no-prisoners brawl that won’t end until one participant stomps the other into the virtual dirt.

So what are we really losing by demonizing losing? In an ideal world, the outcome of a competition isn’t the extreme polarity of winners and losers. Its most honest representation is a sprint: two runners on parallel tracks, neck and neck, until one pulls ahead of the other. The loser didn’t lose because he or she wasn’t good enough; they lost because the winner was just a bit better. And there’s nothing that says winning erases poor performance early in the game. Sometimes winning happens in the last moments of the competition, in the “come from behind” style of victory we always appreciate. The point is, the winner is only the person who pushed ahead at the last minute. Before that, there’s no guarantee that the guy who’s ahead will win, or that the guy who’s behind will lose.

The main benefit of losing is that we get to learn from our mistakes. In sports, performance is a big deal, and athletes take it seriously. They review hours and hours of past performance for both themselves and their competition. They learn from what they did wrong, and what their opponents did wrong, and they try and do better. This is what we miss out on when losing is equated with shame, and when the purpose of winning is to destroy the opposition so that they can’t come back and try again.

Outside of sports, though, one thing that not allowing dignity in losing costs us is honesty. People will go to great lengths to cover the shame of losing by redirecting blame, or by doubling their efforts to find an equally or more devastating attack on their opponent that will turn the tide. We aren’t allowed to own up to our mistakes because it makes us look weak and imperfect. When trying to project a persona (especially online to impress, or in politics), we can’t have any flaws. We have an idea that people will only accept us as superhuman constructs that can do no wrong. On the other hand, we’re horrified when we find out that these personas are actually human after all, as if we didn’t consciously know that already.

Most of the arguing on the Internet comes from this unfortunate situation. Being right is valued so much that being wrong is treated like a crime simply so the “winner” can feel superior and appear intelligent in front of strangers in an attempt to gain a virtual pat on the back and acknowledgement that they’re someone with good ideas and above-average intellect. By punishing the loser in a public forum, the winner shows that he’s someone you don’t want to mess with when it comes to arguing on the Internet, because he’ll destroy you and make you look stupid. It’s the modern day equivalent of kicking sand in someone’s face at the beach.

When Video Attacks

I just received a customer satisfaction survey via email from the company that sold me my second iPhone replacement screen. The survey was very short and straightforward, consisting of clicking stars for “how satisfied I was” and “would recommend to others”, but at the end they offered me the opportunity to leave feedback in two forms. The first was a text box, and the second was a video testimonial: their Flash widget would access my webcam (if I allowed it) and I could record myself, hopefully showing the fine item I purchased and gushing about how much I loved the company that sold it to me.

Video is getting to be extremely annoying to me. I spend a lot of time on the Internet (it’s part of my job), so a lot of that time is spent searching it. Increasingly, I’m finding results that lead to YouTube or other video hosts. Technically, these results are 100% valid, but videos are being recommended for the stupidest and most minor queries imaginable. More often than not, my search could be answered in a single paragraph, so why should I have to sit through some amateur videographer stumbling through an explanation when a few words would have sufficed, and would have been quicker and easier to index for the future?

Among my online circles, many people will use services like Twitch or Hitbox to stream video game play (AKA “Let’s Play”), but just as many in the same circles pop up like gophers to ask “why watch someone else play a game when you could play it yourself?” That’s a very valid question, and it’s one that is going to need a solid answer soon because game streaming is taking off at an exponential rate. A lot of games have it built in now, and both the Xbox One and PlayStation 4 have Twitch streaming included at the OS level.

Video has its place, but there’s a pretty large hump a producer needs to get over before it should be considered the first response in any situation. If all someone is going to do is stream exactly what they’re doing, the way they’re doing it, with no additional value added, then they might as well not do it at all. But far, far beyond that, people who can produce a slick video or stream, who can keep it interesting, and who can bring something more to the table than a paragraph or blog post could are going to be worth watching. In the case of gaming streams, I think of it as the difference between watching the Super Bowl (exciting!), watching Little League baseball (notice how many parents are talking among themselves?), and getting your friends together to actually play a game of basketball (at my age, WAY more trouble than it’s worth). Good production values equal good watchability, but only if you’re looking to put on a show. Doing it just because it’s a thing isn’t going to net any benefits otherwise.

Twitter – A Rant

My “official” foray into social media began with Twitter. I don’t count my early blogging days because that was back before these push-button blogs existed, and the only way we had to really advertise was through web-rings. There really weren’t enough people writing or looking at blogs back then to consider them “social”, or even a network.

I met a lot of outstanding folks through Twitter back then, and the platform used to serve me well, until the lot of us realized that our interactions were hampered by the 140 character limit. We were beyond merely “interacting” and had crossed well into the realm of “discussion” on a regular basis, so we moved first to Google Buzz, and then Google Plus.

I’ve maintained my Twitter account for a few reasons. First and foremost, it’s fast. A lot of information comes to me through Twitter, often before I can find anything on any news website (which makes verification difficult, but that’s the Internet for you!). Secondly, official accounts, when used correctly, can provide a wealth of information and customer service.

Recently, however, I’ve felt that Twitter is a crowd. Not a party, or a gathering, or a mob, but a crowd. Well behaved, for the most part, but when you stand in a crowd in the real world, you’re in the middle of a bunch of cells, the majority of which have nothing to do with you, don’t interact with you, and give you no reason to interact with them. Watching my Twitter stream flow by, I feel that I’m in the middle of other people’s conversations, and that’s not very useful to me. Twitter has taught me that it’s OK to inject oneself into an ongoing conversation, but not all conversations are worth jumping into, and it seems that most of the ones I see these days aren’t. Some are worse; some are just circle-jerks in which people re-Tweet any Tweet in which they’re mentioned, or insist on including the same people by name in their Tweets, creating a self-sustaining in-joke. Ideally, Twitter is meant to be open: a broadcast platform first, a direct address second. Yet some folks use it like an old-fashioned party line, ignoring the fact that there are actually other people outside of their little sewing circle.

One of my pet peeves stems from the fact that Twitter is only useful when it actually conveys information. Some people, for one reason or another, are purposefully ambiguous, or merely forgetful, when passing along news. Vague allusions to potentially interesting or important news stories without a link to a source are the absolute worst transgression of this type: “I’ve got something to say about something, but I won’t provide you with the foundation that makes my point actually meaningful.” Sure, Twitter can field a lot of repeat information, but assuming that everyone who follows you already knows is just dumb; as fast as information moves, it doesn’t always move at the same speed in every direction.

Why don’t I quit the platform, then? I probably will, at least as far as following actual humans goes. It’s ironic that I feel I get more benefit from following companies and brands than I do individuals. Like I said, any conversation of worth happens on G+, and if anything, Twitter is becoming more and more pithy, like Facebook and its users’ “begging for attention” posts.

Holiday Traditions and Traditional Holidays

I enjoyed Psychochild’s post about “The Meaning Of Holidays” earlier this week. Holidays are kind of weird to me; although there are observable holidays throughout the year, some requiring a “buy in” and some not (Easter: yes; Arbor Day: no), I don’t really get into a “holiday mode” quite like I do when Fall rolls around. It’s the time of year when a lot of big holidays arrive bumper-to-bumper, and if you fit certain configurations, you never really stop observing from October to the start of January.

At the risk of too much navel-gazing, I want to know the why behind this seasonal switch. Holidays are at least days on the calendar, and many people simply observe them as such. I can’t imagine what it’s like for those who truly don’t celebrate any of the year’s end holidays here in the West, since we’re practically drowning in trappings absolutely everywhere; I suppose if one wanted to be cynical, it would be easy to justify a “bah, humbug” on the whole thing. But as someone who isn’t like that, the immersion of the Holidays (with a capital “H”) is part of the allure.

A Brief History of The Past

Let’s be frank: when we say “Holidays”, we’re including Halloween and Thanksgiving (here in the West) as a courtesy. We’re really focused on Christmas and Hanukkah. For the purpose of this monologue, though, I’m talking about Christmas (apologies to my Jewish reader), since it’s the one I celebrate.

Christmas is a good holiday because despite the attempts of some to nail it down to one thing and one thing only, it’s many things to many people. For some, it’s one of the Ultra Religious holidays. For others, it’s about a togetherness that doesn’t need a religious reason. It’s one holiday where everyone is right and no one is wrong; we get out of it what we want to get out of it, and really, that’s kind of the point. No matter how the holiday started, Christmas is always a “modern holiday”.

Or is it? I recently read a criticism of how we’re observing Christmas. Specifically, the author stated that we’re not observing a “modern holiday”, we’re observing a “Baby Boomer’s holiday”, by allowing the celebrations of the mid-20th century to color how we celebrate today. On one hand, I guess he or she is correct, because I instantly understood what was meant. On the other hand, I think it’s a short-sighted claim.

Ghosts of Christmas Past

In my view, a lot of what we consider a non-religious, “traditional” Christmas comes from, or is about, life from 1940-something to 1950-something. The Big Christmas Movies like It’s a Wonderful Life and White Christmas were made in that era. A lot of the holiday “comfort music” we have is sung by Bing Crosby and Frank Sinatra, also big in that era. Even one of the more popular modern holiday films — A Christmas Story — takes place in that period.

Ghosts of Christmas Present

I’m not a fan of “newer” holiday stuff. I think the last decent holiday movie was probably National Lampoon’s Christmas Vacation, or maybe Scrooged (both from the 80’s). I can tolerate Michael Buble, but I want to club Mariah Carey with a 30-pound candy cane. And no good “new” Christmas music has been written. “Adult Christmas Wish” can suck it. Hard.

Whys and Wherefores

I’m only 40 years old. I wasn’t alive when the “classic” vision of the Christmas season was actually the present, and yet I dislike (generally) anything that was created for the holiday in the past 30 or so years.

I figure it this way.

We have an unabashed “feel good” vibe in the elder Christmas fare, thanks to World War II. After so much wartime horror, the first Christmas back home must have been the most wonderful thing ever: reuniting with family and friends that no one thought would be seen again. Remembering those who were lost. Being thankful that those who returned from the war returned alive. It was probably amazingly optimistic at that point, and if you’re not concerned with the religious angle, it’s about as close to the “meaning of the season” as you can get: Enjoy, and be thankful for the people around you.

In modern times…well, it sounds like a broken record, but we’ve lost that honest, traditional feeling while fetishizing it at the same time. Almost every ad or commercial in print or on TV this time of year features imagery of “traditional style” holidays, with families eating a festive dinner or welcoming friends and family into the home. It doesn’t take a cynic to understand that these ads are attempting to bridge the traditional sense of family and togetherness with how good it is to buy stuff. Outside of commercials, though, we’re also narrowly focused on bitching about shopping-season creep, or whether or not it’s appropriate for municipal grounds to sport a manger. When it’s generally understood that every shopping outlet is a death-sport arena on Black Friday, is it really a wonder we look back to the days when people enjoyed the holiday in a more honest fashion?

That’s not to say that we here in 2013 can’t enjoy the holiday in an honest fashion; it’s just that I don’t believe our honest feelings about it are rooted in our own lifetimes. No doubt we have fond memories of Christmas as kids, but as an adult, I really find that modern approaches lack anything worth incorporating into my seasonal outlook. New Christmas songs aren’t about the holiday or the season the way White Christmas or Jingle Bells are. They’re about interpersonal relationships, and ham-handed attempts to shame us into remembering our humanity. None of them really addresses The Holiday itself; they’re all as narcissistic as any new song is on any other day of the year. Same with new holiday movies (most of which are made for TV by middle-of-the-road networks like Hallmark or ABC Family).

I may not have grown up in the 40’s and 50’s, but my parents did. Theirs was the “traditional” Christmas we’re talking about here, and so it became my traditional Christmas through them. It’s where I feel comfortable, so naturally it’s becoming my daughter’s traditional Christmas through me. In a way, we are living the “Baby Boomer” vision of the holiday, but it’s partly out of nurture, and not because we view it as intrinsically superior (although in light of my low opinion of modern output for the season, I offer that as a vague generality and not a personal affectation). I have no problem with it; it’s still my holiday as much as anyone else’s. No one owns it, and although I don’t have the same reasons or the same intense source as folks did in the 40’s and 50’s, the feelings are a lot stronger, and a lot more comforting, than what I feel I can get from a more “modern” interpretation.

How long will this go on? How many more generations will Bing Crosby last as a cornerstone of Christmas? Maybe not forever, which is why I think the unnamed author who accused us of “celebrating someone else’s holiday” isn’t seeing the forest for the trees. We’re not yet so far away from the Christmas of our ancestors that we’ve begun to take comfort in images of adults fist-fighting over the last toy in Wal-Mart as the “true meaning of Christmas”. Many of us grew up with those who experienced the Boomer’s Christmas first hand, and like any generational shift, moving away from it will probably happen gradually, as each subsequent generation takes parts of what came before and parts of what it creates on its own, until the oldest parts of the tradition have been marginalized to the atomic level. At some point, I expect people will prefer The Santa Clause over Miracle on 34th Street, but I really hope I’m dead by that time, because I don’t think I’d want to be around when that happens.

Then And Now: A Social Retrospective For Dummies

There was this guy, Adam Orth, who was a creative director at Microsoft, and who stirred up a lot of ire by mincing no words when discussing people’s irritation at the Xbox One being “always on, always connected”. As the Internet is wont to do, people took it personally, and worked quickly to make Orth’s life a living hell (according to him).

I couldn’t care less if some random internet dude tells me to “deal with it” in hashtag form. Yes, I pictured this guy as Any Guy, saying it out loud with a “meh” expression and a shrug of the shoulders like a total douchebag, but let’s face it: my opinion is always going to be valid as far as I’m concerned, and the fact that this one guy on the other side of the country, whom I have never met, thinks otherwise makes absolutely no difference in my life. That’s not an attempt to convince myself while my feelings are hurt; I had totally forgotten about this guy 10 minutes after I first heard about him.

But his re-surfacing got me thinking, as old people do, about Days Gone By (in this case, pre-Internet). I lived during that time, so this isn’t some half-assed co-opt of an “up hill, both ways” story. Back then, our socializing was limited to those within arm’s reach: school, clubs, sports, religious institutions, family, or neighbors. The World was a map, or what we heard on TV news or read in the newspapers. Most people (in the U.S., and especially where I grew up) didn’t know anyone on the other side of the world; it might as well have been the 1300’s, before people were really sailing all over the place and meeting new people. The most international I ever got was when my cousins hosted an exchange student from Spain.

Back in those days, you had two choices when dealing with other people. You could totally bullshit them by acting and behaving in a manner that represented who you wanted to be, or you could act like yourself. Usually people chose the first option if they thought their real selves wouldn’t be accepted. But that could really drain your batteries, because you had to be “on” 24/7. Remember, your interactions were spatially limited, so if you dropped your guard and someone found out that you were, say, a racist and not a choir-boy, news got around fast. Your entire reputation went from “clean cut” to “bigoted liar” in only a few hours. And you couldn’t get away unless you moved.

Here in modern times, people take for granted the fact that on the Internet, nationality or location in the world is almost meaningless. You can interact with people anywhere, any time, and I think we’ve quickly become immune to the “gee whiz” of it all, especially those who grow up in this environment and know no different.

But as the Orth Parable teaches us, we no longer have the option to choose between throwing up a facade or being ourselves. The freedom that the Internet provides for our benefit is the same freedom that allows people to gang up on one another, to find and publish someone’s home address, the names of family members, the location of children’s schools, a person’s religious and political affiliations, and all kinds of information that isn’t horrible by itself, but in the wrong (and determined) hands, could ignite some Really Bad Shit.

Orth chose to show his true self. He spoke his mind, based on his belief that the things people were upset about weren’t worth getting upset about, that people weren’t seeing the forest for the trees and were overreacting because of it. But he shot from the hip, and without the benefit of body language or vocal inflection, his comments came off as condescending and arrogant. He wasn’t talking to anyone specifically; he was addressing a nebulous “They”, which came to include anyone who felt that his comments were aimed directly at them. The Internet, being what it is, took this slight and stretched it, magnified it, blew it out of proportion, and passed it around until people did what anonymous people will do: they made it as personal for Orth as they felt he had made it for them.

Orth was an idiot. For any intelligent person, 10 minutes or less on the Internet makes it pretty obvious that if you’re going to be yourself, you had better be ready for the repercussions. Know this: there are people who are ready for that battle. The rest of us should know that if we want to really enjoy our time on the Internet, we have to be who we want to be, not who we are.

Let’s face it: everyone does and says stupid things, and everyone has opinions that other people would find unappealing. There’s no denying that. Back When, if you said something stupid, it was only stupid if the people in your immediate area thought it was stupid. In the Internet Age, you can say even the most innocuous thing, and it’ll have a world-wide reach in a matter of seconds and linger for weeks, months, or years. Someone, somewhere, will find what you said and will call you an idiot for having said it. So we have two options: stay off the Internet, or present a deliberate and cultivated persona designed to provide as little ambiguity as possible regarding your intent, your stance, and your future interaction with people.

Orth had a job to do, and he blew it. He chose to be himself when he should have been the Xbox One’s Creative Director. I could write another screed about the disdain that corporations have for consumers as a way to explain how Orth actually was speaking as a Creative Director, but I think this was a case of one man acting alone. His follow-up interview shows that he’s no less clueless about how the Internet works now than he was when he was working for Microsoft. He doesn’t seem to understand that he was just as much to blame for not realizing what kind of potential shit-storm his off-handed remarks could start. He continues to be dismissive of the people he once should have worked very, very hard to court, even after this debacle caused him apparent hardship. Had he been a model Creative Director, he would have worked hard — probably to no avail — to sell people on the status quo, not told them to basically fuck off and suck it up.

That someone who is allowed to speak in public on behalf of another (or a company or brand) can be so clueless about how to comport oneself on the Internet is mind-blowing to me. This kind of behavior would have gone totally unchallenged 25 years ago, but the reality of it is that we can’t just assume that people know us, understand us, or that our words won’t have repercussions somewhere in the world, and then feign indignation when the backlash hits us.

Microsoft Surface Pro

As some folks know (probably the same 8 people who have read this blog), I picked up a Microsoft Surface Pro (128GB) yesterday. After my Nexus shattered (it would cost as much to buy a new one as to have Asus repair it), I was tablet-less, adrift in a sea of potential situations where my phone was out of reach, and when I knew something was happening somewhere…but what?

Joking aside, here’s a run-down.

What’s in the Box?

I didn’t take pictures, but there’s a power cable in two parts (power connector is proprietary, which blows), the tablet, the stylus, and a manual.

Physical Presence

The Surface is pretty hefty. I haven’t weighed it, but I’d say it’s about as hefty as Game of Thrones in hardcover. It’s also not svelte; I’d say it’s more akin to the first-generation iPad than the current-generation iPad. I realize that there’s a contingent out there for whom this will be a problem, but we’ll get to that.

The “VaporMg” case is…OK? I guess? The built-in kickstand is great, but it doesn’t make that cool sucking sound it did on stage in presentations. I was kind of disappointed by that. Normally these devices aren’t very “user-maintenance friendly”, but I think this one takes the cake. Along the edge there’s a series of vents that allow the innards to expel heat, which isn’t something you think about a lot on a tablet, but we’ll get to that also.

There are a few ports and buttons around the edge. The top has a power button and a mic. The right side has headphones, volume rocker, and USB port. The left side has a MicroSD slot, power connector, and a port for external video connections. The power port is elongated, and has a series of magnetic connectors. The power doesn’t snap in physically; it’s just magnetically held there, but it’s a powerful hold. When not charging, the stylus’s rocker buttons (if you know Wacom stylus design) serve as a magnetic male to the female port. I wouldn’t trust the stylus to remain connected during a vigorous trip in a backpack, but it sure beats having the stylus loose on a messy desk. The bottom is given over to the keyboard connector. Again, another really powerful magnet keeps it in place. This time, it DOES make that satisfying sound when connected.

Turn It On

If you’ve used Windows 8 on a desktop system, then there is no difference in presentation between what you get here and what you get on the desk. Except you can smear fingerprints on this screen and have something to show for it. I showed it to a co-worker, and he made one swipe of the Modern UI before professing that he could already see that Windows 8 really does best on a touch device. Beyond that, I won’t review Windows 8. Short answer: I’ve used it with real effort, and I like it.

The screen is pretty bright. The glass was ultra-shiny when I unboxed it, and I debated whether or not to touch it (hint: I did) and foul the fine finish with my human-grease. The sound is just OK; better than what you’d get out of most tablets, I think, but it’s not very loud. I watched a video last night, and I had to crank both the Windows and the player’s volumes up to max to hear it. It does have Bluetooth, though, so I can connect my headset to it.

The resolution is 1920 x 1080, which is the current “standard” for PCs. I installed a game (Prison Architect) and it couldn’t handle the screen; I was unable to get it to fit properly. A 1920x1080 wallpaper, though, fit perfectly.

Performance

It’s fast. There was a lot of talk about Surface RT being sluggish and all that, but I can’t speak to that. Swiping on the Pro is instant and gratifying. Sometimes a bit too instant. I’ve occasionally had to chase tiles around as the screen moved under my timid finger. Be direct. Be forceful. Stab that icon like you mean it!

The big sell for me was the stylus (no matter what St. Jobs claims). I’ve always wanted to get rid of paper: it’s transient, and uncategorizable without additional filing systems. Electronic note-taking is great, but add the layer of handwritten notes and drawings, and it’s basically all you could ask for. I still mourn the assassination of the Courier (moment of silence…), but so far, the stylus is awesome. The digitizer was designed by Wacom, so it’s got pedigree, and while there’s still a delay between stroke and cursor, the fine tip of the stylus puts those marshmallow stylus poseurs on other tablets to shame. I can take a page of notes in OneNote, sync it to my SkyDrive, and review it on my PC. It’s my organizational Nirvana.

GAMES!

I actually haven’t gotten this far, would you believe? I did install Steam (Suck it, Newell!), though. As mentioned above, I tried Prison Architect with disastrous results, but it’s an indie game in alpha, so I didn’t expect much. This morning, I installed Civilization V because I was reminded that it had touch-screen controls. I fired it up and (after downloading the .NET framework) it had an option to run in Windows 8 mode with touch controls. The game seemed to run well; I was at work and didn’t get to really PLAY it, but I’ll check in with it later.

Aside from that, there’s whatever is on the Marketplace, which is to say “almost nothing”. But I have hope: Unity just released update 4.2 the other day, which has FREE support for porting to Windows 8 devices. Assuming it’s not too much work, I hope developers will flip that switch in their existing Unity games to get a piece of the Marketplace before it becomes a dumping ground like those other app stores.

Keyboard

I picked up the Type Cover, not the membrane-style Touch Cover. It’s not tiny, and it’s not full-sized, so hand placement is a little off, but it’s really nice. It comes with a built-in trackpad because, yes, despite this being a touch-centric device, you can use a mouse pointer. The underside of the keyboard is a non-slip felt: no logo, no leatherette material. It’s pretty weak as a fashionable cover, but it’s a keyboard. Cut it some slack! And it protects the screen when not in use.

Problems?

I need to use it more to say for certain, but these come to mind.

Battery! At full charge, the meter says about 3 hours. That was in “performance mode”. Turning off the wifi, setting the power saver mode to something more conservative, remembering to put it to sleep instead of letting it time out…those measures should help, but this is not a marathon-use device on battery.

Proprietary power! EVERYTHING in my house uses micro USB connectors, except for the 3DS and this. That means I have to buy more power cables to have them where I need them, and to avoid having to pack up the power everywhere I go.

Survivability! I’ve never really been a “screen protector” kind of guy, but I’m deathly afraid for this device, mainly because its nature demands that it move about a lot, and also because of its price.

Fairy Fingers! I actually had it easier on the desktop than with touch when it came to organizing the Modern UI. Deleting and moving tiles is an exercise in patience, as you have to move a tile just a little bit before you can unsnap or delete it. And I still can’t figure out how to delete pages in OneNote without resorting to the trackpad. You need some very small and nimble fingers to do most of this, I assume.

Windows 8! Nah, not really. Just hater-baiting, because this is really where Windows 8 feels right. Sadly, due to the price and entrenched perception, normally open-minded folks who claim to hate Windows 8 will never get to see it in its native environment like this.

Here’s the “More On That Later” Section

I was at Best Buy, standing around waiting to catch the eye of a salesperson (you’d have a better time finding Bigfoot with your eyes closed, in pitch dark, in the middle of a forest), and I was checking out other options. I saw the Galaxy…something tablet. It had a stylus as well, and was half the price of the Surface Pro. There were also laptops, again at a fraction of the price of the Surface Pro. I caught myself thinking “why not just get one of those and save money?”

The reason is that both of those options only did half of what I wanted. The tablet did tablet stuff, but not desktop stuff. The laptops had a physical keyboard attached at all times, which makes touch-screen use difficult. Both were portable, but neither did everything. That was my main criterion, and my reason for going with the Pro.

“But wait!” the Internet cries. “A laptop is more powerful! A tablet doesn’t have that shitty Windows 8 Modern UI!” Well, you’re both right. Had I wanted horsepower, I would have gone with a laptop, but I already have a desktop. I couldn’t take notes or draw on a laptop, and it wouldn’t be easy to stand up, walk around, and still use the thing on the go. If I had wanted a consumption device, I would have gone with a tablet. But I’ve owned an iPad and a Nexus. I have owned an iPhone, a few Android phones, and a few Windows phones. I have enough consumption devices in my life right now; what I needed was a productivity device. Trading the full power of each to have both in one package is what I expected, and what I wanted. So I don’t mind that it’s an “underpowered” laptop or a “Windows 8” tablet.

One thing I’ve noticed in the months since Windows 8 and the Surface were released: any criticism of Surface as a brand has focused solely on the RT, with none of the praise that the Pro deserves. I can’t speak to the RT, but whenever a blogger on a tech site wanted a punch-line, it was always Surface RT. It would have been really easy for those kinds of people to have their contacts get them a Pro so they could have something worth talking about, but…nothing. It was like a conscious effort to ignore the positive side of the product line.

The Pro is a solid piece of hardware that makes a decent home for a solid piece of software. Yes, the price is daunting, putting it out of reach of many who weigh price over form and function, which is sad on all counts. Cut the price by $300-$500 and I bet you’d be hard pressed to find one on shelves. You can get cheaper laptops; you can get cheaper tablets; you can’t get both in one package for cheap, though. That’s a shame, because I think the Pro is “the” device that actually promises a potential death of desktop computing at the hands of mobility, not because it dumbs things down or because it’s portable, but because it does what desktops do, and it’s portable with far less compromise than you get from other devices.

Zen And The Art Of Blogging

Blogging is a weird sport. Many people do it, and many people wouldn’t be caught dead doing it. Of those who do, some treat it like a religion or a workout, while others only bother to post something when they remember that they have a blog. The reasons are varied, and the results are even more varied still. It’s very easy to set up a blog, but it’s very difficult to write something worthwhile.

But everything we write as bloggers is worthwhile! If it weren’t, we wouldn’t bother, right? So why is it that we can write a great post one week and get mediocre traffic, only to see someone else blog something remarkably similar the next week to great acclaim? It’s frustrating, but the old saw is “write for yourself”, and damn the reviews. We write not because we want to be famous, but because we really like to write, and that’s the most important thing.

Well, yes and no. Yes, we write because we love it. Writing will never go away, and thanks to the Internet, we no longer have to write in the vacuum of our own notebooks, which means that no, we don’t blog for ourselves entirely. If we didn’t care about getting feedback, we would just stick with our own notebooks. Despite what any blogger says, there’s some level of need to be read, and it’s very disappointing when that doesn’t happen.

In some ways, blogs are people’s attempts to connect with others. There are blogs about really personal things, about ephemeral things, about hobby things, but we all write about what we know and what we like, and we want to connect with people who know and like the same things. Blogs are our way of opening conversations with a much wider audience.

I sometimes wonder about people who don’t have blogs or use social networks or anything like that: what do they do with their thoughts and ideas? Yeah, that’s a horrible “Internet Age” perspective, because people got on with their lives just fine before the Internet and all. The short answer is that “they talk to real people”, meaning people around them: friends, family, co-workers. I wonder if my “online-ness” supersedes my ability, or my desire, to deal with people.