John Siracusa’s OS X reviews


Today John Siracusa announced that he won’t be writing more OS X reviews.

Typically, journalists or reviewers don’t announce that they’re stopping something. They just stop doing it, and maybe explain why after someone asks.

But John’s reviews were something truly special, and a lot of people in the tech world have lamented the announcement.

I think the Mac community has always been quite vibrant and passionate, allowing for detailed discussion, sometimes crossing the line into obsession. In other tech circles, the discussion is colder and more rational, even aseptic. Apple discussion has always been more emotional, sort of aiming for greatness. Years ago, when Apple had a small and passionate group of followers, the distinction was quite pronounced.

Tech reviews tend to fall on two extremes. Either too much enumeration of specs and features, making them arid and boring. Or too focused on fitting a narrative, debating whether something is the best or the worst ever.

But these OS X reviews strike the perfect balance between the two, and they remain the gold standard of tech literature.
John’s reviews are extremely detailed, to the level of obsession, yet they are easy to read and understand.
They present a clear exposition of new features, but add historical context for decisions and compromises, and forecast areas of improvement.
Siracusa has strong opinions in quite a few areas, but he presents all the facts and explains his biases.

I can’t stress enough how difficult it is to keep readers hooked and eager to read another 8,000 words when you’re talking at length about filesystems, pixel alignment, background process handling or Finder performance.

I haven’t found anything on par with them, though there are great writers covering technology these days. I can read better reviews now than 5 or 10 years ago.

While I understand his reasons for stopping, and I think we have been lucky to have the reviews around for so long, I can’t help but feel a little sad.

I’ll keep following him on his blog, on Twitter, and of course on his weekly podcast and other collaborations. At least John Siracusa keeps up a healthy output for us to consume. I’m sure we’ll keep reading and listening to great stuff.

Fascinating


The importance of Spock as an icon of the 20th century cannot be overstated.


Let me go back for a second. When I was a child in Spain, access to Star Trek was pretty limited. There were no reruns of the original series; I think it was only broadcast during the 70s, in black and white. The Next Generation aired with a four or five year delay, and it stopped for several years right at the end of the third season. It took me years to find out what happened with Picard and Locutus! The movies weren’t big hits either.

But everyone knew who Spock was. A lot of people told my uncle he looked like “Dr” Spock. He bore a clear resemblance to Leonard Nimoy when he was young.

In a lot of ways I’ve always considered Star Trek my “prime geeky love“. That iconic status and mystery made me seek it out. After I got access to the Internet, I devoured everything about it. It was “my thing“, more than other stuff I love, because it was less known in my circles. I bought the whole DS9 series on VHS from the UK, over the course of several years. I say that I learned spoken English through Star Trek, and written English through RPG manuals.

Spock is the most genuine example of the “cool scientist“, in opposition to the “mad scientist” trope. Probably the first example. And he was the one getting the spotlight in the series. He was the character to follow.

He is iconic at a level that only a handful of characters can match. I’ve seen him on party posters and on t-shirts in big stores. He’s known by people who don’t even know what Star Trek is.

And none of this would have been possible without the extraordinary performance of Leonard Nimoy. It is clear that he contributed heavily to many of the things that define Spock, from the Vulcan nerve pinch to the Vulcan salute. But his biggest legacy is the fact that his performance is replicated constantly. Every single Vulcan in Star Trek is impersonating Leonard Nimoy playing Spock. In tons of other media we have similar interpretations of “emotionless aliens/outsiders“. He created an archetype, which is a huge achievement. That is also a heavy burden, and Leonard Nimoy seemed to embrace it only after years of struggle.

There is also an overlooked idea in Vulcan philosophy that has always appealed to me: IDIC, Infinite Diversity in Infinite Combinations. It resonates with me on many different levels. On the personal level, in how important it is to embrace the diversity that we humans are capable of producing. And in software creation, in how wonderful it is to combine different pieces of code, assembled to produce something magical together.

Leonard Nimoy will be greatly missed. Rest In Peace.


Live long and prosper.

Compendium of Wondrous Links vol VIII



More great reads!


About code creation


The job of developing


Other stuff

TV interface



Is this really the best we can do?

Isn’t it quite absurd that we haven’t nailed this yet?

We recently heard about all the great advances in image quality, 4K, bendable screens… Yet controlling a TV still feels clunky and awkward.

Even worse, given the ever-growing number of devices connected to the TV (DVD players, Blu-ray players, AppleTVs, consoles…), juggling different remote controls is painful, unless a universal remote is used. But even then, the process of selecting an activity is convoluted for very common operations.

For example, if you want to play a movie on an AppleTV (or DVD player, or TiVo, etc.), you may have to turn it on, then (using the TV remote) turn on the TV, select the input (which may involve cycling through a lot of unused inputs, like Composite), adjust the volume, and then pick up the AppleTV remote.

If at any point we need to adjust the volume, we’ll need to grab the TV remote again.

All this is very awkward, and not the best user experience…

So, just for the sake of discussion, here are a few ideas for what I think could make a TV with a much better interface:

– Everything is a channel. Why is HDMI “an input” while TV channels get numbers? Allowing an HDMI input to be selected as, say, channel 3 gives easy access to it and simplifies the interface (see the sketch after this list).

If there is no input signal (e.g. the DVD player is off), the channel is skipped for channel-surfing purposes, unless the user specifically requests the channel number.

– A good remote. I don’t think there’s any excuse for it not being a universal, programmable remote. The included TV remote should aim to control more things than just the TV. It should have as few buttons as possible, more on that later. And, another important feature: it should be possible to make it beep when lost (e.g. by clicking a button on the TV).

– Programmable channels. When a channel is selected, all its settings (mainly image controls, but perhaps also a volume adjustment) change accordingly. The remote also knows that its controls now relate to this channel. For example, if the DVD channel is selected, most of the keys map to the DVD remote (except the volume keys); if the AppleTV channel is selected, the keys refer to the AppleTV controls.

– A list of channels (ideally showing their current input) that allows easy rearrangement and enabling/disabling would be great for navigation and for giving an overview.

An interesting (and useful) design, a two-sided remote on the Chromebox. Obviously, not from a TV manufacturer.

– A companion app that allows all the parameters to be configured easily from a PC or mobile device.
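
Just to make these ideas a bit more concrete, here is a minimal sketch in Python of how the “everything is a channel” model could look. All the names, sources and commands are hypothetical, invented purely for illustration; this is a thought experiment, not a real TV API:

```python
from dataclasses import dataclass, field

@dataclass
class Channel:
    """One entry in the 'everything is a channel' list."""
    number: int
    name: str
    source: str                        # e.g. "tuner:1" or "hdmi:2" (made-up ids)
    picture_preset: str = "standard"   # per-channel image settings
    volume_offset: int = 0             # per-channel volume adjustment
    key_map: dict = field(default_factory=dict)  # remote key -> device command
    enabled: bool = True

    def has_signal(self) -> bool:
        # Placeholder: a real TV would probe the source for a live signal.
        return self.enabled

channels = [
    Channel(1, "Regular TV", "tuner:1"),
    Channel(3, "AppleTV", "hdmi:1", picture_preset="cinema",
            key_map={"play": "appletv:play_pause", "up": "appletv:menu_up"}),
    Channel(4, "DVD", "hdmi:2", volume_offset=5,
            key_map={"play": "dvd:play", "eject": "dvd:eject"}),
]

def surf(current: int, step: int = 1) -> Channel:
    """Channel surfing skips channels without signal; entering the
    number directly would still reach them."""
    ordered = sorted(channels, key=lambda c: c.number)
    idx = next(i for i, c in enumerate(ordered) if c.number == current)
    for _ in range(len(ordered)):
        idx = (idx + step) % len(ordered)
        if ordered[idx].has_signal():
            return ordered[idx]
    return ordered[idx]  # nothing has signal: return wherever we landed
```

With a model like this, the per-channel settings and key_map cover the “programmable channels” idea, and surfing naturally skips dead channels while direct number entry still reaches them.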

I understand that the mix of different devices in the living room is complex, and that some problems may be impossible to fix (for example, an HDMI switch may still present problems, given that more than one “channel” would be coming from the same “input”).

But I find it quite ridiculous that we are still dealing with such terrible interfaces on TVs these days…

Compendium of Wondrous Links vol VII



Here we go again… This time I’m loosely grouping them; it has been a while and there are so many things!

  • An extremely thought-provoking article about how supposedly positive things can have a negative effect: How Perks Can Divide Us. Corporate culture is something extremely complex.
  • More about culture and biases in Mirrortocracy.
  • I do not like the idea that people have a “level”, ignoring all the dynamics (I talked about this here). There are no B players.
  • A set of articles about management skills. It’s a great compilation of the different aspects of management. A great read even if you’re not “an official manager”, as it is essential to understand what a (good) manager should do and the challenges that the role presents.
  • Some ideas about productivity. I like this quote a lot: “There is no one secret to becoming more productive, but there are hundreds of tactics you can use to get more done”
  • We tend to idealise the work involved in things we really like, in particular creating video games. It’s okay not to follow your dreams, but I think it applies to a lot of other areas as well.


Future as a developer and the ever-changing picture


A few weeks ago I came across a couple of articles by Marco Arment sharing the theme that the current pace of accelerated change within the development community is stressing people out and making it difficult to stay up to date. After all, one gets tired of learning a new framework or language every six months. It gets to a point where it’s not fun or interesting anymore.

It seems like two options are presented as available to developers after some time in the field:

  • Keep up, meaning that you rapidly adopt each new technology
  • Move to other areas, typically management

Both are totally valid options, although I’ve already said on this blog that I don’t like it when good developers move to different areas (to me it’s a bit like a surgeon deciding she’s had enough after a few years and moving on to manage the hospital). Obviously, each person has every right to choose their own career path.

But I think this is mostly based on a biased and incorrect view of the field of technology and the real pace of change.

In recent years there has been an explosion of technologies, in particular for the web. Ruby on Rails almost feels like it was introduced at the same time as COBOL. NodeJS seemed to be in fashion for a while. The same with MongoDB or jQuery.

We all know that being stressed is not a great way to learn.

In the last 6 or 7 years there has been an incredible explosion in open source fragmentation. Probably because of GitHub (and other online repos) and the increase in communication through the Internet, the bar for creating a web framework and offering it to the world has been lowered so much that a lot of projects that previously would have gone unnoticed have gotten more exposure. The overall effect is positive, but it comes with the negative side that every year there is a revolution in technologies, forcing everyone to catch up and learn the brand-new tool that is supposedly best for current development, increasing the churn of buzzwords.

But all this is mostly an illusion. We developers tend to laugh at the common “minimum 3+ years of experience in Swift”, but we still buy into the notion that we should be experts in a particular language, DB or framework from day one. In the one in demand today, of course, or we are just outdated dinosaurs who should retire.

Software development is a young field, full of young people. That’s great in a lot of ways, but we need to appreciate experience, even if it comes from using a different technology. It may not look like it, but there are still a lot of projects done in “not-so-fancy” technologies. That includes really old stuff like Fortran or COBOL, but also C++, Java, Perl, PHP or Ruby.

Technologies get established by a combination of features, maturity, community and a little luck. But once they are established, they’re quite resilient and don’t go away easily. They stay useful for quite a long time. Right now it’s not that difficult to pick a tool that is almost guaranteed to be around for the next 10-15 years. Also, most of the really important stuff is totally technology-agnostic: things like writing clean code, structure, debugging ability, communication, teamwork, transforming abstract ideas into concrete implementations, etc. That simply does not go away.

Think about this: iOS development started in 2008. Smartphones are radically different beasts from the ones available 6 years ago, probably the environment that has changed the most. The basics are the same, though. And even though Swift was introduced this year, it’s based on the same principles. Every year there have been tweaks, changing APIs, new functionality. But the basic ideas are still the same. Today a new web project using LAMP is totally viable. Video games still rely on C++ and OpenGL. Java is still heavily used. I constantly use ideas developed mainly in the 70s, like the UNIX command line or Vim.

Just because we get tons of news every day about new startups building applications on new paradigms doesn’t mean those paradigms don’t coexist with “older” technologies.

Of course, there are new tricks to learn, but it’s a day-by-day, additive effort. Real revolutions and paradigm shifts are rare, and usually not a good sign. Changing from MySQL to PostgreSQL shouldn’t be considered a major career change. Seeking a certain stability in the tools you use should be seen as a good move.

We developers love to stress the part about learning something new every day and constantly challenging ourselves, but that should be kept in perspective by allowing time to breathe. We’ve created a lot of pressure on ourselves in terms of constantly pushing new ideas, investigating side projects and devoting 100% of our time to software. That’s not only unrealistic. It’s not good.

You only have to breathe. And just worry about doing good work and enjoying learning.

Compendium of Wondrous Links VI



  • They finally found all those buried Atari cartridges, and confirmed a beloved urban legend. Just wonderful.
  • This episode of @ExtraCreditz follows up on an idea I’ve always had about education. The key is being demanding, but allowing a lot of opportunities.
  • An amazing book introduction, showing how no one is immune to thinking that they are stupid. Lots of things in life are hard.
  • Readability in code is not about being literary. It’s about making the code easy to understand. You don’t read code, you explore it.
  • The Great Works of Software. The premise is extremely interesting. What are the most influential pieces of software?
  • The hilarious (it’s funny because it’s true) Programming Sucks and a follow-up, What programming is Like.
  • Is programming a dead end job? I still can’t help but feel sad each time a (good) developer decides to move into management.
  • It’s easy to forget how much things have changed in terms of software distribution. What Writing and Selling Software Was Like in the 80’s (yep, also from The Codist. You should subscribe).
  • The computer world is heavily dominated by English, and even more so by the Latin alphabet. This idea of making a programming language in Arabic is fascinating. It not only shows how difficult it is to set up a friction-free environment outside “the ASCII world” (the magnitude is not comparable, but trying to code in languages like French or Spanish has a lot of friction too), but it also shows how alien (yet beautiful) a different alphabet looks. I wonder what code and programming would look like if the dominant language had been something like Chinese or Arabic.
  • What is the “Agile mindset” anyway? The graph is very interesting. Especially the “Chaos labeled as Agile” side.
  • I don’t really like the idea of a “rivalry” between Vim and Emacs. I prefer to consider them two valid options. But this article does a good job of explaining their different appeals and why they have been around for such a long time in computer-years.
  • 10 Most common Python mistakes. Good to check.