It’s probably the worst possible way to end a paragraph, because really, even you realized you were making so little sense that you decided to clarify without being asked.
Hey twitter. Just so you know before we get started, I think you’re cool. I enjoy the time we spend together, generally speaking. It’s just — Look.
I don’t really know why I check you so often during the ‘boring’ parts of the year. You know the kind: no elections, no colossal world event hitting the maximum information saturation that I can count on you to provide when something qualifies. Snowden, I guess? That’s kind of why I have a bit of a beef with you, though not really because of the NSA security angle.
It’s news, certainly. But it’s not drama, and that’s what you push on your users more than anything else. Although I don’t think that’s really what you’re good at.
When the news broke that there was this guy named Edward Snowden and he did some Very Questionable Things regarding Our Nation’s Security (quite), baby – you were saturated.
Tweet after tweet of raucous fury and indignation, followed by the echoes of the Entertaining Crazies that inevitably show up around every big hashtaggable news story.
It’s really irritating that you can’t italicize text, twitter. Posting something and then a “!” immediately afterwards, after a beat, is really just not the same as it is in an IM conversation. If the real-time angle was really about making the platform more of an “instant communication tool”, you’ve left out some basics about writing.
I think that, as a platform, you subconsciously care more about prose and sentence structure than you’re willing to admit.
To get hundreds of retweets, you have to craft a really great tweet, one that stands in contrast to the average mental puke of the common twitterer. Everything in comedy is timing and context, right? Twitter’s really the only large-scale platform where I can point to the message delivery contract as the thing that defines the product.
That’s cool — because the natural side effect is that you had a big differentiator from any other service that existed when twitter first did.
You posted stuff, but the web was more primitive then. You just refreshed to see new content. Maybe you had an app or client that gave you real-time capability, but it was never really a concrete contract of the service. What twitter did was open up your feed and realize that there were only a few tenths of a second between you hitting ‘submit’ and the tweet showing up on my screen, and on the screens of however many people follow you, if they happen to be looking at the time.
There’s really no upper limit on the creativity that twitter provides for the connoisseur. If you buy into the theory that writing within a structure forces you to work within the medium, 140 characters was the best arbitrary decision ever made. You force the really clever people to craft these gems of creative writing. You also let people work in stream-of-consciousness style and, for the first time on the web, it’s actually forced to be concise. Vine followed it to its logical conclusion.
I’d like to think @nerdist and company thought along those lines when they were making @midnight. It’s the cleverest way of monetizing twitter I’ve seen yet.
Edit: I just got “Well, Actually”’d by @nerdist himself:
— Chris Hardwick (@nerdist) May 11, 2014
Being unproductive sucks. Breaking the cycle can be writing ten lines of code, or forking some interesting project on github and poking at it with a stick. Every developer is different. Find your charger somewhere and go do. Do it every day. Do it with quality in mind.
Tooling needs to be good to feel productive, and there’s a sense that you can never really profile well enough: constant fiddling with data-structure variations for Entity and Component patterns. You’re not at a sub-17ms loop, so you spend days figuring out how to hack the inner entity tick loop, optimizing away a 3,000th of a second to get there.
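That kind of fiddling usually lands somewhere like this. A minimal sketch in TypeScript of the inner entity tick loop people hand-tune to stay under a ~16.7ms frame budget; the names and the struct-of-arrays layout are my own illustration, not any particular engine’s API:

```typescript
// Hypothetical minimal ECS-style tick loop. A struct-of-arrays layout
// keeps position/velocity data contiguous in typed arrays, which is the
// usual first optimization when chasing a 60fps frame budget.
type World = {
  count: number;
  posX: Float32Array;
  posY: Float32Array;
  velX: Float32Array;
  velY: Float32Array;
};

function createWorld(count: number): World {
  return {
    count,
    posX: new Float32Array(count),
    posY: new Float32Array(count),
    velX: new Float32Array(count),
    velY: new Float32Array(count),
  };
}

// One tick: integrate velocity into position. No per-entity object
// allocation and no property lookups inside the hot loop.
function tick(world: World, dt: number): void {
  const { count, posX, posY, velX, velY } = world;
  for (let i = 0; i < count; i++) {
    posX[i] += velX[i] * dt;
    posY[i] += velY[i] * dt;
  }
}
```

The point isn’t the physics; it’s that shaving fractions of a millisecond almost always means flattening your entities into dumb arrays like this and iterating them in one pass.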
Unless you’re building something that lets you automatically do what you would normally do manually, you’re not advancing squat.
It doesn’t actually matter if you think it’s cool. Is something worth paying for? I’d rather get an answer to that question first.
Writing Python is like sharing notes on what to do with the computer. It’s hard to make a case for Ruby when I feel like it has to interpret.
If you’re anything like me, you’ve probably heard variations of that sound ringing in your ears, followed swiftly by a shot of dopamine that floods your brain. “Good Job!” you no doubt tell yourself, “Now I can do X even better!”.
Good Job, indeed.
The “Ding” is a great thing, and it comes from the great sense of accomplishment in learning a new skill, grasping some new algorithm, or just having a making-the-numbers-go-up experience.
I still get the “Ding” every once in a while. I’ve been ‘programming’ for probably close to 17 years, and I’ve either dabbled in or destroyed a line of code in at least 8 different languages, each with their own unique “Dings”.
– effective use of file descriptors? Ding!
– Stacks, Queues, and Heaps? Ding!
– Memory management in a lower-level language (C, C++, Objective-C)? Ding!
– When you can immediately name a problem set without setting foot in Stack Overflow? Ding!
– More broadly, Object Oriented Programming? Ding!
The “Ding” is the essential unit of measure for any creative profession. It’s the psychological equivalent of feeding an addiction. It’s how we stay sharp, how we adapt, and it’s a fountain of self-esteem.
Engineers get hit with this more often than most creatives because, outside of direct recognition (which, let’s face it, is pretty rare), it’s our easiest and most straightforward metric that we’re getting better at our jobs.
The path to the “Ding”, however, is one of personal anguish and ruin. Nothing scars the psyche of an engineer more than a problem without a solution. The hours of 11:00 PM to 3:00 AM are typically when we “Ding”, and shortly afterwards, cruelly, sleep comes, and oftentimes with it the original hit of the “Ding” evaporates as fast as our consciousness does.
If I’m not searching for a “Ding” professionally, I almost feel like I’m wasting my time. “Ding” droughts are as rough as championship droughts for your favorite sports team, a failed start-up, or worse – boredom.
The lack of good “Ding” in the Front End world is forcing me into other domains. Mobile, for one. Games, for another. “Web” technology, outside of the frameworks behind the scenes that power it, feels like it’s growing stagnant.
– Webkit is amazing, but how much more amazing is it now than it was 2 years ago?
– I still have to support Internet Explorer 8, and often 7. If you do any kind of agency work or web-application building, don’t lie: you do too, and you hate it.
– The “Mobile Web” problem is largely being solved by the select few who are REALLY talented at responsive design. Kudos to them.
– Apps are not the answer, but at least they’re attempting to innovate.
– Our mobile devices are consuming a larger part of our ‘connected’ attention spans.
– How much longer are we going to have to wait before we see ES6? How long until IE updates to conform?
– CSS3 started being supported ages ago.
I’m craving me some “Ding”, but the closest I can get to it is WebGL. The last real “Ding” I had was when my eyes were blown apart by how much time I saved using SCSS/Compass.
Where’s the innovation that the web displayed with aplomb over the ’00s?
– CSS3 3D Transforms were announced back in 2009!
– CSS Animations were originally announced in 2009!
– CSS Cascades and Inheritance was 2005!
– Web Sockets have been around since 2009, though really, how much have you been using them in ‘real’ projects? (WITHOUT a shim, that is. Socket.IO is amazing, but you’re lying to yourself if you think it’s really using web sockets more than 50% of the time.)
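For what it’s worth, the shim’s core decision can be sketched in a few lines. This is a hypothetical, heavily simplified version of the kind of transport negotiation a library like Socket.IO performs (the real library does an upgrade dance starting from HTTP long polling; the names here are my own illustration, not its actual API):

```typescript
// Pick a transport the way a fallback shim might: use a native
// WebSocket when the environment provides one, otherwise degrade to
// long polling. Purely illustrative, not Socket.IO's real logic.
type Transport = "websocket" | "long-polling";

// `env` stands in for the global object (window, self, globalThis).
function chooseTransport(env: Record<string, unknown>): Transport {
  return typeof env["WebSocket"] === "function"
    ? "websocket"
    : "long-polling";
}
```

In a browser population still heavy with IE8, it’s exactly this check failing that quietly sends so much “web socket” traffic down the polling path.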
You still can’t access the filesystem on a broad range of browsers.
You still can’t natively grab the webcam on a broad range of browsers, for God’s sake. (Yes, I know all about the Capture API; call me when it’s available on something more than Canary.)
WebGL is a pipe dream for mass appeal, especially when it comes to non-US users. Only 53% of browsers support WebGL, and 23% of those offer only partial support.
I don’t know where I can realistically go for my “Ding” fix, but it’s certainly not the web right now. There’s always a wish to push it as far as it can go, but I work at a job where that kind of luxury is followed by “What does the browser support look like?”, and then it’s just disappointment.
The thing is, I know that the well isn’t dry. The web will continue to be a great place with a hundred different ways to do the same thing, and that’s wonderful. But it’s not pushing the limits anymore; at least not for the last few years. IE8 (and to a lesser but real extent, IE9) is largely the culprit in this mess, I will freely admit.
These things are fantastic as “Ding!” moments for the people involved. But I’m usually left wondering if people are addicted to a steady stream of easily attainable “Ding” moments, and now that the well is running a bit dry from overexposure, stagnation and reinvention are creeping back from the depths of IE5 and 6.
But in reality, I theorize that the people who’ve been working on the web for as long as they knew what it was are realizing that they are catching up to what was possible on other platforms, and not blazing as many trails.
Consistency should be the focus of the next half-decade of progress on the web. We’ve had the contest over who has the better rendering engine for too damn long, and as a direct result of the Chromium vs Gecko vs Trident war, a massive percentage of the population is either forced onto, or not educated enough to move off of, an outdated browser when alternatives exist that are just as free and just as easy to use.