Categories
Audio&Games

Gun games

As Electronic Arts prepared to market Medal of Honor Warfighter, the latest version of its top-selling video game released in October, it created a Web site that promoted the manufacturers of the guns, knives and combat-style gear depicted in the game.

NYT.

I’m just saying, and I’ll go with one example:

Call of Duty makes a billion dollars of revenue every year. Hordes of dudes and dudettes, but mostly dudes, line up when a new installment of Activision’s main franchise comes out. This core target of young males spends time online sharing a lot of interest in guns, through discussions about the game or about real-life weapons. Guns that are accurately modeled and simulated by armies of engineers, sound designers and 3D artists. 40 million monthly active players. In 2011 alone, 21 new “first person shooter” games. The army itself uses them for training.

We game developers can’t dodge our responsibilities with the old freedom-of-speech trick while we know how powerful a learning tool games are. We can’t say games build skills and then say of realistic shooters that no, they’re just “for fun”. Imagine 40 million players playing Sim City, trying out solutions to make our lives in cities better.

If shootings are so wrong and we want to fix that problem, everybody should get on board and be ready to change. Game industry included.

Categories
Me Myself&I

The perfect age, the perfect movies

I was watching E.T. on TV and couldn’t help but think how great of a movie it is. But I remembered that I saw it at kind of the perfect age: seven. I say perfect age because I was a little younger than the movie’s hero, which is perfect because when you’re seven you really want to be ten, and when you’re ten you really want to be fourteen, at least that’s how I felt. So you really get into a “fascination mode” watching someone a bit older doing amazing things with an E.T.

Same thing with watching Akira at 11 (full story here), when the hero is sixteen. Same thing watching The Matrix at 20, when the hero is in his mid twenties.

I feel lucky to have seen these three monuments at the theater at the perfect age, but also, if they marked me so hard, it was because I didn’t know anything about what I was going to see. Like, nothing except the main poster before getting tickets.

Also, sound. From the first ten minutes of E.T., built exclusively on sound effects and no faces, to Akira’s unique music and soundscapes, to the tightest audio I had ever experienced with The Matrix, they all left a big mark on my forehead and my earholes.

Also, social commentary. E.T. is clear about what its director thinks of divorce. Re-watching it, I see how personal it was to Spielberg, and I find it awesome to point things out with determination, but in an honest way, in a science fiction movie. Akira’s cyberpunk tone seemed a little too dramatic to me in 1991, but in 2012 it feels oh so powerful: a corrupted, gigantic, decadent and shallow society controlled by the army in 2019? It looks like we’re doing it; we already have the drones, and the decadence with the 1%. Again, for a science fiction action movie, what a great vision twenty years ago. The Matrix’s social commentary is too obvious and too painful: yes, people are mostly like the dude who wants the fake steak, as we can see in how people prefer the comfort of freedom-killing silos on the web instead of the real, server-side, I-install-my-own-shit-and-I’m-free web.

So to me that’s where the art vs profit discussion falls short, because these three movies have both. Strong vision and innovation, but also enough compromises to reach a larger crowd, whether by cutting down the complexity (Akira) or spending time explaining what’s going on (The Matrix). Because these movies are widely considered landmarks and really punched me in the face, I think we should analyze these successes more instead of going for pointless Indie vs AAA, Art vs Profit discussions, creatively speaking.

Categories
Me Myself&I

Deverticalization

I wanted to write about Anil’s blog post. It talks about how the web used to be open and favor fluid exchanges, whereas today the web is getting silo’d, heavily so.

And we all know where the inspiration comes from: Apple.

Companies like Twitter, Google, Facebook and most of the startup culture love this very special capitalism that just doesn’t compromise on anything.

I saw an article on how capitalism is breaking the web, but that’s not true: capitalism accommodates any way of doing business. You can even make money with a totally free and open source system, or simply charge a yearly fee like Flickr and Pinboard do.

The problem is greed. People can’t stop drooling over Apple’s margins and aura. So they start by becoming super cocky (Twitter and its relationship with developers), edgy about design (UI/UX redesigns all over the place), mute to users (Google not answering anything on its forums) and hostile to competitors (Google blocked Windows Phone users from uploading to YouTube, how lame is that).

And developers followed, lost in trying to keep up with ever-changing APIs -when they exist- or trying to be the first to emerge as “the” one product/service that does it best, first on the platform that established vertical integration as “the” thing to do: Apple’s.

How to change that?

Well, developers need to use what we learned from these very popular services to build excellent experiences and apps, only without the dictatorship and silo part. For example, I really wish someone was building clients for Dave Winer’s river-of-news thingy; for now it’s too complicated to set up to interest anyone, even me. But you developers, be the bridges that bring great ideas to users. Let the data flow. Always. It’s not only one of the internet’s basics, it’s what made computers awesome even back when we were sharing floppy disks. An amazing freedom, and absence of scarcity.
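The river-of-news idea itself is dead simple: pull items from several feeds and merge them into one reverse-chronological stream. A minimal sketch, assuming plain RSS 2.0 input; the feed XML below is made up for illustration, and a real client would of course fetch the feeds over HTTP:

```python
# Minimal "river of news": merge RSS items from several feeds,
# newest first. Stdlib only; the sample feeds are hypothetical.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def parse_feed(xml_text):
    """Yield (title, published) tuples from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        published = parsedate_to_datetime(item.findtext("pubDate"))
        yield title, published

def river(feeds):
    """Flatten every feed into one list, newest item first."""
    items = [entry for xml_text in feeds for entry in parse_feed(xml_text)]
    return sorted(items, key=lambda pair: pair[1], reverse=True)

feed_a = """<rss><channel>
  <item><title>Post A1</title><pubDate>Mon, 10 Dec 2012 09:00:00 GMT</pubDate></item>
  <item><title>Post A2</title><pubDate>Wed, 12 Dec 2012 18:30:00 GMT</pubDate></item>
</channel></rss>"""

feed_b = """<rss><channel>
  <item><title>Post B1</title><pubDate>Tue, 11 Dec 2012 12:00:00 GMT</pubDate></item>
</channel></rss>"""

for title, published in river([feed_a, feed_b]):
    print(published.date(), title)
```

That’s the whole concept; everything a client adds on top (read state, caching, a nice UI) is packaging around this one merge-and-sort.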

Developers need to make awesome native apps around open data. The browser is going down; let me have great native apps on every single platform, be platform agnostic. Stop trying to make me sign up and be part of a “community” if I just want to read news; simply make great apps. Stay simple, let me handle the social. Embrace what new platforms give you and build tools to empower users.

As for people, I should probably spend more time teaching those around me how to use computers (90% don’t know ctrl+F to search a page; a majority never heard of the quotation marks you can use in a search engine). But at the same time, people are the worst. They never really want a better way to do things, or to exploit what their computers offer. Computer literacy is low, and it’s the reason people flock to tablets and closed ecosystems, not understanding what they’re losing or about to lose.

Categories
Me Myself&I

Small adults

It’s not worse. It’s not better. It’s wrong.

I always feel uncomfortable when people think that dead kids are the “worst of evil”. There’s no gradient in death; it’s sad in every case.

We think that a 50-year-old man killed randomly by a shooter is less of a big deal because he’s in the middle of his life, but what if he was about to enjoy the best 20 years of his life after working hard for them? Conversely, what if the kid killed was going to ruin some people’s lives by drunk driving, hitting a family van? We’ll never know. It ended abruptly, and all we have are our eyes to cry.

My cold-ass orphan past probably makes me think this way, but I never really considered kids as kids, but as small adults. I don’t forget that they understand things, even if they can’t express them through language. I don’t forget that in some countries right now, some are soldiers trained to kill people, or that in some cultures kids have sex very early on or watch it. It weirds me out too, but you can understand how this is possible if you don’t treat kids as the little pure and cute things that they aren’t, sorry to hurt you, Western normative culture. They are small adults with shitty experience, and that’s why we need to nurture them, not because they’re virtually perfect, adorable little miniatures of oneself. After four or five years on this planet, they are not.

I especially don’t like how people realize that children dying from bullets is a terrible thing only now, when, as Jezebel notes, “black children accounted for forty-five percent of all child gun deaths in the United States, despite being only fifteen percent of the child population.” Please read the report and think about things like “Between 1979 and 2009, gun deaths among white children decreased by 44 percent and increased by 30 percent among black children over the same period.”

Nobody gave/gives a shit. No one cares, it’s like black people are not part of society. But a tragedy happens and the nation’s emotional attachment to it is strong enough that it might change things. I hope.

Too bad all these dead black kids never had any impact on gun laws. I feel sorry, because if they had, a lot of children, including those white kids from Newtown, could be alive today.

So much for preventing the “worst of evil”.

Categories
Me Myself&I

Society stress

So now it’s about mental illness.

We know where that mental illness grows. Not the medical one, the societal one. It grows in people in an unstable society. A society that makes no sense in so many ways and fucks us up. When I read Gawker’s unemployment stories, I see what can fuel the explosion of an individual. 2010 saw the highest suicide rate in the US in 15 years.

What most boomers driving our Western society don’t get is that things have really changed: today you can make the best choices and it just doesn’t matter, you’re still possibly going to end up in shitty places through the weight of things you cannot control at all. The sense that, as an individual, things are more random than ever before is terrible.

That’s like, the strongest despair you can get.

You don’t even need to be that mentally ill to spin out of control. It just happens. The Colorado shooter was an apparently brilliant dude, studying neuroscience. How could we have seen what was about to happen? We could not; no one could. On the other hand, the Virginia Tech killer had been receiving treatment, therapy and special education for years, and that didn’t stop him. He wanted to replicate Columbine and did.

Look at the Empire State Building shooting: the 58-year-old perpetrator went there to kill his boss after being laid off. No psychiatric problems ever; he had just had enough after being evicted from his apartment. Hitting the streets at almost 60 is probably crazy hard to swallow.

We shouldn’t dismiss people’s pain and reach for the convenient “he was just a crazy and sick person” line, especially in a psychotic society teaching us to be good AND merciless. Which of course doesn’t yield good results, especially with white guys, because when shit hits the fan for them, they are the ones falling from the highest point. Everybody else is used to not having all the rights and is therefore more capable of dealing with the shittiest shit. This guy went nuts because of the IRS and the government, crashing his plane into a building. Meanwhile, black people deal with the worst statistics possible -employment, incarceration, child death rate- and don’t try that much to make everyone pay for it. Black people even have the lowest suicide rate. Like Paul Mooney says, “white folks, thanks for making us tough.”

Anyway, if it’s about mental illness, it’s about society’s, not individuals’. That’s the one we need to fix.

Categories
Audio&Games

State of GPUs and gamedev

I needed to visualize the power of our devices, from tablets to desktops, to understand a bit of what is going on in terms of processing power. Today we have a large number of choices, much more than before. If we look at how games run, things are more complicated than ever: some games will run on nothing, others are better with a quad-core CPU, others really love a big amount of video memory… Between programmer skills, tools and architecture, things can go from 60 fps at all times to terrible stuttering and a bad experience.

But what about the differences in raw power, just to understand what’s going on under the hood? Well, I tried to make this chart and boy, it’s not easy. Nobody’s using the same criteria, so I had to rely on GFlops, which are not that accurate a measure of what a processor can actually do. But that’ll do for my little experiment.
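For reference, the theoretical peak those GFlops figures come from is usually just units × clock × FLOPs per cycle. A minimal sketch of that back-of-the-envelope math; the chip specs below are illustrative assumptions, not vendor datasheet numbers:

```python
# Theoretical peak: GFLOPS = execution units x clock (GHz) x FLOPs per cycle.
# The specs below are made up to show the orders of magnitude involved,
# not pulled from any real datasheet.
def peak_gflops(units, clock_ghz, flops_per_cycle):
    return units * clock_ghz * flops_per_cycle

chips = {
    "hypothetical tablet SoC":     peak_gflops(4, 0.25, 8),
    "hypothetical notebook iGPU":  peak_gflops(12, 1.1, 8),
    "hypothetical desktop GPU":    peak_gflops(384, 1.6, 2),
}

for name, gflops in chips.items():
    print(f"{name}: {gflops:.0f} GFLOPS")
```

Real-world throughput is always well below this peak (memory bandwidth, occupancy, drivers), which is exactly why GFlops is a rough yardstick and not a benchmark.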

 
Ta-da

Wow. I didn’t expect tablets, notebooks and desktops to be separated by orders of magnitude. I thought tablets were really close to notebooks, but they’re nowhere near, especially knowing that the tablet numbers cover CPU+GPU, whereas for notebooks and desktops it’s the GPU part only.

3D games can totally run smoothly on tablets, but you can’t expect good AI or complex behaviors alongside complex 3D at under 100 GFlops. Therefore, games on these devices will stay simple for a long time (fanless, battery-oriented designs can’t bump processing power as fast as graphics cards do).

In the notebook world things are a little crazier: the Intel HD3000 -a steaming pile of shit according to computer enthusiasts- is making a killing in sales despite being way below AMD’s solutions in raw power (yes, at the same thermal envelope). To give an idea, the AMD A10 is roughly at the level of an Xbox 360. That’s already a lot of processing power.

But then look at the desktop: the GTX 560 Ti is the second most used graphics card on Steam -the first one is the HD3000- and it’s vastly superior at crunching numbers, over 10 times more than what the best seller can do! It’s not difficult to understand why game developers have problems scaling game engines if between two machines you have 10 times more power. And if we add dual, quad-core or bigger CPUs, the amount of power you can get for under $1,000 becomes ludicrous.

I included the biggest GPUs available today -$500 cards- to see how far we’ve come, and well, the numbers show it: it’s insane. And that’s a single chip; some graphics cards have two of these monsters (2×4300!!)

What does this say for apps? There’s a tremendous, unused amount of processing power available today. Technology is way more advanced than what programmers can do with it. They are barely starting to use multiple cores, can you believe that? Code parallelization is a bitch, so coders never rushed to it.

Why is that important? Because, like Chris Hecker said, we never have enough power. The more, the better. Better behaviors, better experiences; more can’t be bad. The problem is that the market is heading toward underpowered, can-last-14-hours-on-battery chips.

What’s the problem with that? Well, although some would say it weeds out bad programmers who can’t lean on brute force to make their apps faster, it also pushes good programmers to optimize a lot. Which is great and all, but at the same time you want good programmers building things: making the app better, adding a killer feature, not making it run perfectly and smoothly; that should be a given.

So, trends to come? 2D is going to stay strong. Unity will get even stronger. And I hope we will find bridges between tablets and graphics beasts, so that we can use the latter remotely to beef up our underpowered new mobile-device overlords when needed.

Categories
Me Myself&I

What happened to affordable?

Reading stuff on FM synthesis and computers, I realized something: making new technology available to as many people as possible was the motto of the 80s and 90s, and we lost that.

The goal, following a trend set at the end of WWII, was to be affordable. Technology was hype, but not in a hipster way; it was just obvious that we should all use it, and from the IBM PC to the Yamaha DX7 to the Roland 909, they all aimed at making things available to those who didn’t have the money, the space, sometimes the knowledge, to get “the best of the best”.

Through affordability and people’s creativity, these machines conquered the world and became ubiquitous, changing our culture forever. I love that.

Today, technology is snobbish. Machines are sold at a higher price because brands all want some of that crazy Apple margin, so they all sell computers around $1,000 instead of $700. People expect $1,000 as a starting price now; they want “the best”, despite not knowing anything about much better deals (all these expensive laptops with a shitty HD4000, seriously). We don’t even pay for our expensive phones anymore, or if we do, we pay half a thousand dollars to look at pictures while taking a shit. Think about that. A $500 tablet for your three-year-old kid? Come on now.

It’s definitely a change compared to emergent technology from ten or twenty years ago. Technology literacy didn’t spread, and manufacturers are using that to their advantage.

The computer industry is now focused on selling pseudo-primo stuff -Intel ultrabook bullshit- to people who don’t understand anything about computers but are ready to shove whatever price at them to impress their friends. It’s no longer about doing things, it’s about showing off.

All of a sudden affordable equals cheap, but I don’t think it does. Netbooks are on their way out now, but the last ones running on AMD let you play Crysis. On $450 machines. Doing more with less; people call that great value, or yield, and usually love it.

So OEMs either sell low-margin cheap products like netbooks and Android tablets to tractor-beam customers in, or sell high-margin expensive products like ultrabooks. There’s a sweet middle ground to go for, but more importantly, it’s no longer a free market with choices if one company (Intel) dictates how things work through an oligarchy of vassals.

Categories
Me Myself&I

The now

If I had been told as a kid that I would be in Los Angeles making music in 2012, I would have been like, shut your face hole! And then proceeded to high-five myself in the mirror with a little dance.

But that’s not all there is. I’m under the biggest stress ever, the kind that takes a bit too many brain cycles. I compensate by nerding out like crazy but that doesn’t feel like enough.

After living between L.A. and Paris for almost four years, 2013 needs to be the year I settle. I’m lucky to have been able to travel, but it’s exhausting, and making audio really requires a big dose of stability.

Problem is, I’m still not certain where I will settle; it depends on things I can’t really control. Frustrating. So close.

C’est la vie; bite the bullet, hang tight, etc.

Categories
Music

Genesis memory lane


One of the only consoles ever with a headphone jack and volume slider. Sigh. (poster here)

Some people grew up with the infamous SID or NES sounds. It was the new thing to them. To me it was the transition from those bleeps and bloops to a much more detailed world through the one and only Sega Genesis, part of the last generation of consoles that created music and sound through chips instead of simply playing back audio files.

Mainly powered by the Yamaha YM2612 OPN2 sound chip, it is the sound I grew up with every Saturday at my friend’s place in the 90s. I didn’t own one at the time, but I would borrow it and listen to sound-test menus over and over. This FM-based series of sound chips from Yamaha was also ubiquitous in Japan for 20 years. Yamaha also developed the DX7 synthesizer, which anyone who grew up in the 80s/90s has heard at least once in a song (look at the list of artists in bold).
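The core FM idea behind all of those chips is surprisingly small: one sine wave (the modulator) wiggles the phase of another (the carrier), and the modulation index controls how bright and metallic the result gets. A minimal two-operator sketch of the textbook formula; this is not an emulation of how the YM2612 is actually programmed:

```python
# Two-operator FM synthesis in its simplest form:
#   y(t) = sin(2*pi*fc*t + I * sin(2*pi*fm*t))
# fc = carrier frequency, fm = modulator frequency, I = modulation index.
# Textbook formula only; real chips like the YM2612 add envelopes,
# operator algorithms and feedback on top of this.
import math

def fm_sample(t, fc=440.0, fm=220.0, index=2.0):
    """One sample of a two-operator FM tone at time t (seconds)."""
    return math.sin(2 * math.pi * fc * t + index * math.sin(2 * math.pi * fm * t))

sample_rate = 44100
tone = [fm_sample(n / sample_rate) for n in range(sample_rate)]  # one second
print("samples:", len(tone))
```

Change `index` from 0 to, say, 8 and the timbre sweeps from a pure sine to something harsh and bell-like, which is exactly the range of sounds the Genesis wrings out of a handful of these operators.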

Rounded basses, metallic leads, dreamy bells, razor-sharp pads, dynamic snares -thanks to the second sound chip- and amazing musicality for such a limited -and yet so versatile- pair of synthesizers.

I don’t know if it was because Japanese composers were all into funk at the time, but I think the chip invites the funk. From bubbly Rhodes to slap-bass emulation (hello Seinfeld), it sounds funky. So many games just sound like this video: some kind of progressive jazz-funk, 90s FM Japanese groove. Definitely dated as a style, but I can tell you the amount of mastery on display in this video is amazing. The dude knows the chip by heart.

Yamaha and musicians, thanks a lot. /drops a tear.

Categories
Audio&Games

U Wii U Wii U

R.O.B. Saying Goodbye
R.O.B., asking himself where his Virtual Boy’s at.

Fantastic article by Ian Bogost on the Wii U. I like that Ian demystifies Nintendo’s history and shows that, like all companies, they’ve been ferocious and merciless while also being innovative, genuine and honest, with things like this Wii U kind of saying “sorry, we don’t know where gaming is going, but here’s a new box to play with”.

Other game companies did the exact same: MS made console development a reality for anyone and did the Kinect experiment; Sony totally understood developers and an entire generation of gamers with the PlayStation; Apple discovered the App Store magnet and touch devices, etc. And they all did some terrible shit too, from greedy corporate moves to totally missing the boat. I like seeing this human side in massive corporations. Nobody’s perfect, not even entities.

Developers argue and defend their worshipped brand -as usual- but the point to me in this Wii U case is more about whether or not Nintendo is getting better at what they were bad at: third-party support and online distribution. Better, but still shitty. Whatever happens with this new console, things will probably benefit Nintendo and only Nintendo, like they have for the past 30 years.

A comment caught my eye:

Atari crap may have caused the downfall of consoles but not computer games. The glut of crap crushed the game industry but not people who loved to make games. The mainstream may have become disenchanted with consoles, but computer games defined the core. Nintendo was vital to re-establishing the industry and the mainstream, but without them, computer games would have continued to spread.

I saw that. The 8/16-bit console era was exploding, but so was the PC. It’s the start of darlings like id or Epic, who in the 90s were making over $100,000 selling and shipping games on floppy discs. There was money outside the console market and the best were doing fine, both in the US and Europe: Japan is the exception, not the norm. Looking at Japan as the future for games elsewhere is wrong: FPS never worked there, and F2P doesn’t do so well here. Consoles became big in the West, but PCs stayed and are getting stronger every day.

Unfortunately, their childish game themes entrenched a cultural meme, that games should not be taken seriously. Whereas, computer games continued to produce a variety of mature content. I think Nintendo was good for the industry, but not for game culture.

Pretty much the opposite for Sony! It’s funny how, culturally, the childish visuals never took over older generations; it always grabs the youth, though. The problem is we have more and more old people; the average gamer is now around 33 years old. But consoles and app stores are terribly narrowing what subjects we can explore in games because of censorship. Even Steam doesn’t allow Adults Only games. It’s weird!

And this is where the paradox lies: we used to make games for adults when the money really came from kids, and now we make child-looking, teenager-ish games when the money really comes from much older, mature crowds.