Fascinating

The importance of Spock as an icon of the 20th century cannot be overstated.


Let me go back for a second. When I was a child in Spain, access to Star Trek was pretty limited. There were no reruns of the original series; I think it was only broadcast during the 70s, in black and white. The Next Generation was shown with a four or five year delay, and it stopped for several years right at the end of the third season. It took me years to find out what happened with Picard and Locutus! The movies were not big hits either.

But everyone knew who Spock was. A lot of people told my uncle he looked like “Dr” Spock. He bore a clear resemblance to Leonard Nimoy when he was young.

In a lot of ways I’ve always considered Star Trek my “prime geeky love“. That iconic status and mystery made me search for it. After getting access to the Internet, I devoured everything about it. It was “my thing“, more than other stuff I love, because it was less known among my circles. I bought the whole DS9 series on VHS from the UK, over the course of several years. I say that I learned spoken English through Star Trek. And written English through RPG manuals.

Spock is the most genuine example of the “cool scientist“, in opposition to the “mad scientist” trope. Probably the first example. And he was the one getting the spotlight in the series. He was the character to follow.

He is iconic to a level that only a handful of other icons can match. I’ve seen him on party posters and on t-shirts in big stores. He’s known by people who don’t even know what Star Trek is.

And none of this would have been possible without the extraordinary performance by Leonard Nimoy. It is clear that he contributed heavily to a lot of the things that define Spock, from the Vulcan nerve pinch to the Vulcan salute. But his biggest legacy is the fact that his performance is replicated constantly. Every single Vulcan in Star Trek is impersonating Leonard Nimoy playing Spock. In tons of other media we have similar interpretations of “emotionless aliens/outsiders“. He created an archetype, which is a huge achievement. That is also a heavy burden, and Leonard Nimoy seemed to embrace it only after years of struggle.

There is also an overlooked idea in Vulcan philosophy that has always appealed to me: IDIC, Infinite Diversity in Infinite Combinations. It resonates with me on a lot of different levels. On the personal level, in how important it is to embrace the diversity that we humans are capable of producing. And in software creation, in how wonderful the combination of different code is, assembled to produce something magical together.


Leonard Nimoy will be greatly missed. Rest In Peace.


Live long and prosper.

Compendium of Wondrous Links vol VIII


More great reads!


About code creation


The job of developing


Other stuff


TV interface

Is this really the best we can do?

Isn’t it quite absurd that we haven’t nailed this yet?

We recently heard about all the great advances in terms of image quality: 4K, bendable screens… Yet controlling a TV still feels clunky and awkward.

Even worse, given the never-ending increase in devices connected to the TV (DVD players, Blu-ray players, AppleTVs, consoles…), using several different remote controls is painful, unless a universal remote is used. But even in that case, the process of selecting the activity is awkward for very common operations.

For example, if you want to play a movie on an AppleTV (or DVD player, or TiVo, etc), you may have to turn it on, then (using the remote) turn on the TV, select the input (which may involve cycling through a lot of unused inputs, like Composite), adjust the volume, and then get the AppleTV controller.

If at any point we need to adjust the volume, we’ll need to grab the TV remote again.

All this is very weird and not the best user experience…

So, just for the sake of discussion, here are a couple of ideas for what I think could be a TV with a much better interface:

– Everything is a channel. Why is HDMI “an input” while TV channels are numbered? Letting the HDMI input be selected as channel 3 allows easy access to it and simplifies the interface.

If there is no signal on the input (e.g. the DVD player is off), the channel will be skipped for channel-surfing purposes, unless the user specifically requests the channel number (there’s a small sketch of this idea after the list).

– A good remote. I don’t think there’s any excuse for it not being a universal, programmable remote. The included TV remote should aim to control more things than just the TV. It should have as few buttons as possible (more on that later). And, another important feature: it should be possible to make it beep if lost (e.g. by clicking a button on the TV).

– Programmable channels. When a channel is selected, all the settings (mainly image controls, but maybe also a volume adjustment) change accordingly. The remote will also know that its controls now relate to this channel. For example, if the DVD channel is selected, most of the keys map to the DVD remote (except the volume keys); if the AppleTV channel is selected, the keys refer to the AppleTV controls.

– A list of channels (ideally showing their current input) that allows easy rearrangement and enabling/disabling would be great for navigation and for giving an overview.

An interesting (and useful) design: a two-sided remote on the Chromebox. Obviously, not from a TV manufacturer.

– A companion app that allows all the parameters to be configured easily on a PC or mobile device.
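Just to make the “everything is a channel” idea concrete, here is a minimal sketch in Python (all the device names and settings are made up for illustration): each channel bundles an input source and its own settings, channel surfing skips inputs with no signal, and direct selection always works.

```python
# A minimal sketch of the "everything is a channel" idea.
# All device names and settings here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Channel:
    number: int
    name: str
    source: str                 # e.g. "tuner", "HDMI1", "HDMI2"
    active: bool = True         # False when the device is off / no signal
    settings: dict = field(default_factory=dict)  # per-channel image/volume tweaks


CHANNELS = [
    Channel(1, "Broadcast", "tuner"),
    Channel(2, "DVD", "HDMI1", active=False),               # DVD player is off
    Channel(3, "AppleTV", "HDMI2", settings={"volume": 5}),
]


def surf_up(current: int, channels=CHANNELS) -> Channel:
    """Next channel up, skipping inputs with no signal (wraps around)."""
    ordered = sorted(channels, key=lambda c: c.number)
    following = [c for c in ordered if c.number > current and c.active]
    return following[0] if following else next(c for c in ordered if c.active)


def tune(number: int, channels=CHANNELS) -> Channel:
    """Direct selection always works, even for inactive inputs."""
    return next(c for c in channels if c.number == number)


print(surf_up(1).name)  # AppleTV: channel 2 is skipped, its player is off
print(tune(2).name)     # DVD: an explicit request still selects it
```

The same structure could carry the per-channel remote-key mapping, so selecting the AppleTV channel would also switch what the buttons do.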

I understand that the collection of different devices in the living room is complex, and that there are problems that may be impossible to fix (for example, having an HDMI router may still present problems, given that more than one “channel” will be coming from the same “input”).

But I find it quite ridiculous that we are still dealing with such terrible TV interfaces these days…

Compendium of Wondrous Links vol VII


Here we go again… This time I’m loosely grouping them; it has been a while and there are so many things!

  • An extremely thought-provoking article about how seemingly positive things can have a negative effect: How Perks Can Divide Us. Corporate culture is something extremely complex.
  • More about culture and biases in Mirrorcracy.
  • I do not like the idea that people have a “level”, ignoring all the dynamics (I talked about this here). There are no B players.
  • A set of articles talking about management skills. It’s a great compilation of the different aspects of management. A great read even if you’re not “an official manager”, as it is crucial to understand what a (good) manager should do and the challenges that the role presents.
  • Some ideas about productivity. I like this quote a lot: “There is no one secret to becoming more productive, but there are hundreds of tactics you can use to get more done”
  • We tend to idealise the work involved in some stuff we really like, in particular in creating video games. It’s okay not to follow your dreams, and I think it could apply to a lot of other areas too.



Future as a developer and the ever changing picture

A few weeks ago I came across a couple of articles by Marco Arment on a shared theme: the accelerated pace of change within the development community is stressful and makes it hard to stay up to date. After all, one gets tired of learning a new framework or language every six months. It gets to a point where it’s not fun or interesting anymore.

It seems like there are two options available to developers after some time:

  • Keep up, meaning that you adopt rapidly each new technology
  • Move to other areas, typically management

Both are totally valid options, though, as I’ve already said on this blog, I don’t like it when good developers move to different areas (to me it’s a bit like a surgeon deciding she’s had enough after a few years and moving on to manage the hospital). Obviously, each person has every right to choose their own career path.

But I think that it’s all mostly based on a biased and incorrect view of the field of technology and the real pace of change.

In recent years, there has been an explosion of technologies, in particular for the web. Ruby on Rails almost feels as if it were introduced at the same time as COBOL. NodeJS seemed to be in fashion for a while. The same with MongoDB or jQuery.

We all know that being stressed is not a great way to learn

In the last 6 or 7 years there has been an incredible explosion in terms of open source fragmentation. Probably because of GitHub (and other online repos) and the increase in communication through the Internet, the bar to create a web framework and offer it to the world has been lowered so much that a lot of projects that previously would have gone unnoticed have gotten more exposure. The general effect is positive, but it comes with the negative effect that every year there is a revolution in terms of technologies, which forces everyone to catch up and learn the brand new tool that is the best for current development, increasing the churn of buzzwords.

But all this is nothing but an illusion. We developers tend to laugh at the typical “minimum 3+ years of experience in Swift”, but we still get the notion that we should be experts in a particular language, DB or framework from day one. Of course, in the one in demand today, or we are just outdated dinosaurs who should retire.

Software development is a young field, full of young people. That’s great in a lot of aspects, but we need to appreciate experience, even if it comes from using a different technology. It doesn’t look like it, but there are still a lot of projects done in “not-so-fancy” technologies. That includes really old stuff like Fortran or COBOL, but also C++, Java, Perl, PHP or Ruby.

Technologies get established by a combination of features, maturity, community and a little luck. But once they are established, they’re quite resilient and don’t go away easily. They are useful for quite a long time. Right now it’s not that difficult to pick a tool that is almost guaranteed to be around for the next 10-15 years. Also, most of the really important stuff is totally technology agnostic: writing clean code, structure, debuggability, communication, teamwork, transforming abstract ideas into concrete implementations, etc… That simply does not go away.

Think about this: iOS development started in 2008. Smartphones are radically different beasts from the ones available 6 years ago, probably the environment that has changed the most. The basics are the same, though. And even if Swift has been introduced this year, it’s based on the same principles. Every year there have been tweaks, changing APIs, new functionalities. But the basic ideas are still the same. Today a new web development using LAMP is totally viable. Video games still rely on C++ and OpenGL. Java is still heavily used. I use ideas mainly developed in the 70s, like the UNIX command line or Vim, all the time.

Just because every day we get tons of news about new startups building applications on new paradigms doesn’t mean that they don’t coexist with “older” technologies.

Of course, there are new tricks to learn, but it’s a day-by-day, additive effort. A real revolution and change of paradigm is rare, and normally not a good sign. Changing from MySQL to PostgreSQL shouldn’t be considered a major career change. Seeking some stability in the tools you use should be seen as a good move.

We developers love to stress the part about learning something new every day and constantly challenging ourselves, but that should also be put in perspective, allowing time to breathe. We’ve created a lot of pressure on ourselves in terms of having to be constantly pushing new ideas, investigating side projects and devoting ourselves 100% of the time to software. That’s not only unrealistic. It’s not good.

You only have to breathe. And just worry about doing good work and enjoying learning.

Compendium of Wondrous Links VI


  • They finally found all those buried Atari cartridges, and confirmed a beloved urban legend. Just wonderful.
  • This episode of @ExtraCreditz follows up on an idea I’ve always had about education. The key is being demanding, but allowing a lot of opportunities.
  • An amazing book introduction, showing how no one is immune to feeling stupid. Lots of things in life are hard.
  • Readability in code is not about being literary. It’s about making the code easy to understand. You don’t read code, you explore it.
  • The Great Works of Software. The premise is extremely interesting. What are the most influential pieces of software?
  • The hilarious (it’s funny because it’s true) Programming Sucks and a follow-up, What Programming is Like.
  • Is programming a dead end job? I still can’t help but feel sad each time a (good) developer decides to move into management.
  • It’s easy to forget how much things have changed in terms of software distribution. What Writing and Selling Software Was Like in the 80’s (yep, also from The Codist. You should subscribe).
  • The computer world is heavily dominated by English, and even more so by the Latin alphabet. This idea of making a programming language in Arabic is fascinating. It not only shows how difficult it is to set up an environment without problems outside “the ASCII world” (the magnitude is not comparable, but trying to code in languages like French or Spanish has a lot of friction; there’s a small example after this list), but it also shows how alien (yet beautiful) a different alphabet looks. I wonder what code and programming would be like if the dominant language had been something like Chinese or Arabic.
  • What is the “Agile mindset” anyway? The graph is very interesting. Especially the “Chaos labeled as Agile” side.
  • I don’t really like the idea of a “rivalry” between Vim and Emacs; I prefer to consider them two valid options. But this article does a good job explaining their different appeals and why they have been around for such an extremely long time in computer years.
  • 10 Most common Python mistakes. Good to check.
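As a small taste of that friction, Python 3 allows non-ASCII identifiers (PEP 3131), so names can be written in Spanish or Greek, but the keywords and the standard library remain English:

```python
# Python 3 accepts non-ASCII identifiers (PEP 3131): the names below
# are Spanish and Greek, yet "def", "return" and "print" stay English.
π = 3.14159


def área_círculo(radio):
    return π * radio ** 2


print(área_círculo(2))  # 12.56636
```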

Visual Programming and Mental Constructs

I watched the Apple WWDC keynote live yesterday. I am far from an Apple developer, but I use OS X and iOS every day, and I’m interested in new stuff. There was a full section devoted to developers, which is great (well, it’s supposed to be a developers’ conference, after all), and, arguably, the most interesting part (from a developer’s perspective) was the release of a new programming language, Swift.

It was announced with an (irrelevant) comparison with Python in terms of speed (I actually have plans to write a post about “why Python is not really slow“, but I digress), as well as a lot of other details that (IMO) are completely pointless in terms of what makes a programming language good or bad.

Most pointless benchmark ever

I am generally skeptical about announcements of new languages, almost as much as about new web frameworks. Sure, it adds a new flavour, but I’m not that sure about real advancement in tech. Creating a new language, complete with a proper “clean and beautiful” syntax, is not really that difficult. The difficult part is creating a vibrant community behind it, one that loves the language and works to expand it, to push the boundaries of current tech, to make amazing applications and tools, to convince other developers to use it and to carry the torch. The target of a language is developers. “End customers” couldn’t care less about how the guts of their products are made. “Ruby Sharp? Whatever, I just need it to help us increase our sales.”

Interestingly enough, languages get a lot of character from their communities, as they embed their values in the relevant modules and tools. A great example of that is “The Zen of Python“. There’s nothing there about whitespace, list comprehensions or classes, but it reflects a lot of the ideas that are common in the Python world, values of the Python community. Using a language is not just writing code, but also interacting with other developers, directly or even just by reading the documents and using the APIs.
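Fittingly, the Zen of Python ships inside the language itself as an Easter egg, which says a lot about a community embedding its values in its tools:

```python
# Importing the standard "this" module prints the Zen of Python,
# Tim Peters' nineteen aphorisms about Python's design.
import this
```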

As mandatory as honor

Obviously, Apple is in a very special situation, as it can force developers to use whatever it likes for its platform. Hey, they managed to create an Objective-C ecosystem out of nowhere, which is impressive. For what it’s worth, they can even tailor a language for their platform and not worry about anything else. iOS is a big enough platform that devs will have to learn the language and official IDE and use them. And I am pretty sure that in this case it will be an improvement over the previous environment.

But the part that I am most skeptical about is the “visual programming” stuff. One of the “wow” announcements was the possibility of creating “playgrounds”, to show the results of the code interactively. That means that, for example, a loaded image will be displayed, or a graph can be shown with the results of a function. And that’s the part that I’m not really sure is interesting or relevant at all.

Does it look cool? Absolutely. May it be interesting once in a while? Sure. But I think that’s the kind of process that, in day-to-day work, is not really that useful in most kinds of programming.

Programming, more than anything else, is creating a mental image of the code. Code can be a very complex thing, especially in a big application. But normally we don’t need to keep the whole code in our mind. We only have to keep certain parts of it, allowing us to focus on one problem at a time. That’s the main principle behind modules, classes and other abstractions. I can use OS calls to open a file, to draw some pixels on the screen, or to make a call to a remote server. All of that without having to worry about file systems, graphic drivers or network protocols. And I can also use higher level modules to search files, create 3D models or make HTTPS calls.
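A tiny sketch in Python makes the point (the file name and URL here are just placeholders): each call sits on top of a whole stack we never have to think about.

```python
# Opening a file hides the file system completely: no inodes,
# no block devices, just a file object.
with open("notes.txt", "w") as f:
    f.write("Standing on the shoulders of giants.\n")

# One call hides sockets, the TLS handshake and HTTP framing.
from urllib.request import urlopen

with urlopen("https://example.com") as response:
    print(len(response.read()), "bytes fetched over HTTPS")
```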

And the amazing power of programming is that you are coding on the shoulders of giants. And on the shoulders of regular people. And on the shoulders of your co-workers. And on your own shoulders. That’s a lot of shoulders combined.

But a lot of that process deals with the unavoidable complexity of the interactions. And being able to move from an abstract view to a more specific one, to look inside and outside the black box, is crucial. It may not be evident, but the mental process of programming deals a lot with that sudden change in perspective. This is one of the reasons multiparadigm languages are useful: you can move between different abstractions and levels, using the proper one in each case (especially for leaky ones).

And there are lots of those processes that are not easily represented with graphs or images. They are constructs in your mind: loops, flexible structures, intuitions about the weak points of an algorithm, variables changing values, corner cases… Showing all the intermediate results may be detrimental to that quick change in perspective. Too much information.

There have been experiments with visual programming, trying to represent code as visual blocks in one way or another, for a long time (at least 25 years). They are useful in certain areas, but they are far from a general solution. There are also interactive notebooks that allow easy display of graphs and help with interactivity. IPython Notebook is an excellent example (and a very similar idea to the playground). But, again, I feel that those are specialised tools, not something that is that useful in most programming contexts.
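To give an idea of the overlap, this is the kind of cell you would type into an IPython Notebook (assuming numpy and matplotlib are installed): the result of the code, a plot, is rendered right below the cell, much as a playground shows the graph of a function.

```python
# A typical notebook cell: the plot appears inline, below the code.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 100)
plt.plot(x, np.sin(x))
plt.show()
```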

I’m just skeptical. All of this doesn’t necessarily mean that Swift is bad, or that those tools are wrong. Maybe the new Xcode will have a lot of amazing tools that will help create fantastic applications (I still don’t like IDEs, though). There are already people checking the docs and giving the new language a try. But I think that it has to show how good or bad it is by itself, through the developers who decide to use it. So far, it is just an announcement. I just feel that most of what was said in the keynote was not relevant to determining whether it’s a good working environment or not; it was just a gimmick. Yes, obviously these kinds of announcements are publicity stunts, but in this particular case it looks especially so.

Looks cool, but it is not particularly relevant to how the mental process of programming works or what makes a language good.

Hmph. Visual blocks. Heh. Excitement. Heh. A Developer craves not these things.