Going ahead with a new game — Deathfire

I am certain it has not escaped your notice that I’ve been working on some game-related things over the past few months. My posts and tweets about Unity were surely a dead giveaway.

Well, I have decided that it may be time for me to share with you some of the things I’m doing, because with every new day that I am working on my current project, I get more excited about it. As you may have guessed, I am working on a new role-playing game. I have to point out, however, that it has nothing to do with Thorvalla, the project I tried to Kickstart a few months ago. Thorvalla is dead and off the table. There was not nearly enough interest and support for the concept to make it happen, so that continuing on would have been a fruitless endeavor. Instead, I decided to learn from the experience as a whole and move forward.

Deathfire logo

The new game I am working on is called Deathfire… for now. It is a working title at the moment, but the longer we’ve been using it, the more it has grown on us, and there is actually a chance we may use it for the final game. We’ll have to wait and see. A lot of water will pass under that bridge before we cross it.

There are currently three people working on Deathfire. Marian Arnold is the lead artist on the project. Marian used to work for my old company Attic, just after we released Shadows over Riva, and he has a pretty long gaming history himself, having worked on games such as the Divine Divinity series. What’s even more important, however, is that he is a complete role-playing buff and immediately jumped at the occasion when I approached him with this idea. Being such an avid role-player, he often serves as a sounding board for me while I design the game and bounce ideas off him. Oftentimes he comes back to me with comments such as, “We could do this and then do that on top of it, making it work even better.” All in all, I feel that Marian is a great complement to me, forcing me to think, re-think and try harder all the time. The many code rewrites I had to do to try out and/or accommodate some of our cumulative ideas are certainly testament to that.

Then, there is Thu-Lieu Pham, who is also lending her artistic abilities to the project. Lieu is a classically trained illustrator and graphic designer, and her strengths lie squarely in the domain that oftentimes makes fantasy games so mesmerizing — the tantalizing look of characters and scenes. Many of you may recall the paintings she did for Thorvalla, such as the iconic dragon ship at sea scene that we used as the game’s main visual hook, as well as the female Viking character.

Currently, Lieu is busy drawing character portraits for Deathfire’s character generation. Instead of creating them in 3D, we decided early on to try to capture the look of Golden Era role-playing games. The covers by Larry Elmore, Clyde Caldwell, Brom, and Jeff Easley come to mind right away. Call me old-school, but to me this kind of vivid imagery and paintbrush work is much more inspirational and engaging than a rendered 3D character.

And then, there is me. I am currently serving double-duty, designing and programming Deathfire. It is marvelously invigorating, I can tell you that, and it reminds me of the good old days when Hans-Jürgen Brändle, Jochen Hamma and I were making games such as Drachen von Laas, Spirit of Adventure or Blade of Destiny, the first of the Realms of Arkania games, which were, to a large degree, just the three of us working triple-duties, designing, programming and often also illustrating these games. Working with such a small team on Deathfire appeals to me very much and I am enjoying myself, perhaps just a little too much.

I decided from the outset that I would be using Unity3D for the game. As you can tell from previous posts and some of my tweets, I have become a big Unity fan, as it puts all the right development tools at my disposal at a price point and level of quality that are unbeatable. The package has not let me down once so far – though I would like to say that 3D object import could be improved quite a bit.

Deathfire uses a first-person 3D role-playing environment, and I am glad that we can rely on the muscle of Unity to make sure that we do not have to limit ourselves because the technology can’t keep up. Unity may not be a bleeding-edge engine, but it can sure play ball with the best of them, and the fact that it is so incredibly well thought through makes developing with Unity a lot of fun. More importantly, we can focus on creating the game instead of the technology to run it on.

I know you may have a lot of questions about the game now. What, when, where, how… I’ll get to all that later down the line. For now, however, I simply want you to let the info sink in, and hopefully you’ll be as excited as we are. Visit this blog regularly; I plan on sharing more of Deathfire with you as time goes on. In fact, after some deliberation, I’ve decided that I will cover the development process like a production diary of sorts, right here on my blog. And don’t forget to follow me on Twitter (@GuidoHenkel) for a constant vibe-meter as to what I am up to.

Talk to you again soon…


The illusion that is UltraViolet

Recently I read the headline that the CEO of Sony Pictures thinks UltraViolet needs improvement. The headline made me chuckle because I could have told them that two years ago. In fact, I pointed it out in reviews back then. These days I do not even bother to check for UltraViolet because, to this day, it is completely useless. What also made me chuckle is the fact that Sony CEO Michael Lynton made the comments for all the wrong reasons. That “it’s not easy enough to use” is not the reason UltraViolet fails, and despite what he says, this is not the “early days.” Those were two years ago. Technology moves fast, as we all know, and two years are a lifetime in the digital domain. During that period, UltraViolet could have – and should have – matured into a solid platform. It didn’t, because unless it goes through a complete paradigm shift, it simply can’t.

The real problem with UltraViolet, from my point of view, is not so much its technical implementation but the actual presumptions the underlying paradigm makes. UltraViolet is a streaming video format for mobile platforms, and as such it has very limited value and even less applications.

Even though we live in a world where everyone is connected and always-on, watching a streaming movie requires a bit more than an Internet connection. It requires a broadband connection that is always-on, and that’s where the problems start.

My iPad, for example, is Wi-Fi enabled but has no 3G, which means that as soon as I leave the house, I’m disconnected, and without an Internet connection, there’s no UltraViolet. Silly, I know. I really don’t watch movies on a tablet at home – that would be just weird. I have TVs around the house installed for that very purpose, and I evidently bought a DVD or Blu-ray disc, because that’s where I got my UltraViolet copy from. So why would I want to view a movie in an inferior format, riddled with compression artifacts and in low resolution, when I could instead watch it in 1080p on a large TV screen?

So, the moment I *would* be interested in watching a movie on my tablet is the very moment that UltraViolet disconnects and becomes unavailable. Epic fail! The logic that this would make sense or would even remotely be attractive for consumers boggles the mind and it stuns me that Hollywood executives are evidently still not seeing the real problem with UltraViolet.

But let’s say, for argument’s sake, that I wanted to watch an UltraViolet movie on my iPhone. Not sure why anyone would want to watch a movie on such a tiny screen, but fair enough, let’s just say…

The problem I have now is that, for some time already, phone carriers have been charging for bandwidth. The glory days when the iPhone was first introduced and you could get unlimited Internet and data on your phone for 30 dollars a month are long gone. As a result, I am very reluctant to stream a one-gigabyte movie to my phone, exhausting my monthly data plan allotment in the process. But even if you have unlimited data and don’t mind paying through the nose for it, you still have to contend with the fact that many carriers throttle the bandwidth on these data plans, degrading the quality of your video even further as it streams. Not to mention that connectivity and download speed are far from guaranteed – every AT&T user can tell you that. So once again, UltraViolet’s proposition and appeal fall flat on their face.

But let’s put all that aside for a moment, and let’s just assume I am still not deterred and really, really want to watch an UltraViolet movie on my iPhone. The problem now is that with all the crowd noise around me, it is impossible to actually hear the movie. (How I wish the guy yelling into his cell phone so you can hear it all across the airport would just shut up… yeah, you know the type.) Sure, I could use headphones or earbuds, but I refuse to turn myself into a Borg just yet and do not enjoy wearing an earpiece – or maybe I simply forgot them before I left the house. Since UltraViolet does not offer subtitles either, I am once again flat out of luck, and once again UltraViolet has no value to offer.

Ah, my stop just came up, twenty minutes into the movie, and I am asking myself why I even bothered trying to watch a movie on the go. I don’t know about you, but I rarely have two hours – the equivalent of the length of a typical Hollywood movie – available to me while I am on the go.

So, with all that in mind, is the failure of UltraViolet to connect really surprising? It is clear to me that UltraViolet is simply a bad idea with no practical real-world application as long as it does not offer digital download capabilities in addition to its streaming services and does not add basic accessibility features such as subtitles to the mix. It was created in a bubble and sold to Hollywood studios as a technological illusion at a time when the studios had gotten a taste of digital blood and were zealously looking for ever-growing opportunities to resell their catalogs. Well, it’s a pipe dream, and it’s not going anywhere anytime soon.


As I’ve become more active in the games industry over the past months, I’ve also paid more attention to gaming-related links, as you can imagine. No, I’m not going to talk about the PlayStation 4 announcement. I will leave the hyperbole to others who have more enthusiasm for the upcoming new console from Sony. It leaves me pretty cold, to be honest, and the PS4 has virtually no new features I care for. I do not need gameplay recording, and I certainly do not need a “Share” button. What I really need are better games… and those have nothing to do with faster hardware or higher pixel rates. None of the game demo videos I’ve seen running on the PS4 impress me. I am hard-pressed to pinpoint real highlights where I’d say the graphics elevate the gameplay to new levels. It’s all just more of the same, just a little more orgiastic.

While I was following some game-related posts on Twitter and Facebook, I stumbled across the website Kultpower, and I thought I’d let you guys know about it, particularly my German-speaking readers. This website has an online archive of many classic German computer magazines. You can find scanned versions of magazines such as Happy Computer, Powerplay, Amiga Joker and others. Every page of the magazines has been scanned and is available in a fairly large format, making it possible to read all the exciting news of yesteryear, going back to something like 1985 or so.

For me personally, it is just fun to go back and check out the articles and reviews of my old games, such as this first preview of the Realms of Arkania games. The article was written when we first announced that we would bring the famed pen & paper game to computers, revealing details and screenshots for the first time. As such, this article is real computer-game history, so make sure to click on the pages for a larger view. I hope you enjoy the read as much as I did.

There are countless other gems in these archives, not to mention the small photo of myself from back then, posing in front of the top-of-the-line computer of the day – an Atari TT! Also note that the black box visible on the left side of the picture, under the computer monitor, was actually a development prototype of the Atari Panther, an ill-fated video game console that was cancelled and replaced by the Jaguar before it ever saw the light of day. I had actually started development of a role-playing game specifically for that console when Atari pulled the plug on the hardware. Ah yes, good times…

Meanwhile, I am plowing away on my own little project here. I’ve spent a lot of time with Unity, and since my last post I have decided to use NGUI for all my user interface needs. The package is absolutely fantastic and a real treasure trove; I constantly discover new features and things that are exceedingly helpful. For me, the $95 it cost to purchase NGUI was money well spent. Not only have I used NGUI’s functionality throughout my current project, it has also offered me a lot of insight into effective Unity programming.

Another true gem that I’ve found is SceneSync, a Unity plug-in that allows multiple people to collaborate on the same scene. It is incredibly easy to use, plugs right into your project and works without a hitch. Combined with a version control system – I am using Git for that purpose – it makes it breezily easy for people to work on the same project without risking losing data or destroying one another’s work.

Because I found both NGUI and SceneSync so incredibly valuable, I am now constantly browsing the Unity Asset Store to see what other gems are lurking there. If you have any plug-in or script you can recommend highly, feel free to leave a comment below for me to check out.

So, what am I working on, you may ask? Well… I’m not going to tell you. At least not yet. Let’s just say for now that it includes Elves, Dwarves, Warriors, Wizards and such and a lot of virtual dice rolls.


Unity3D fully lives up to its reputation

In recent weeks I have spent a bit of time with Unity3D, the middleware package that gets a lot of acclaim — and deservedly so, I must say. Those of you who have followed me for some time, or know me, are aware that I am always very outspoken about issues I see in the things I do or work with – whether it was my criticism of Qualcomm’s operations during the BREW era, Hollywood studios’ approach to DVD and Blu-ray, or things such as the Kindle. When I see problems, I address them and offer constructive criticism, because oftentimes I do have workable solutions to offer. The fact that on various occasions my comments have led to actual change shows me that my attempts not to be unreasonable have been successful. I usually just point out things where I have the distinct feeling that someone simply didn’t think them through.

The truly curious thing about my work with Unity3D so far is that I have not a single complaint. This is by far the most complete, reliable and comprehensive middleware package I have ever seen, and I have yet to find an area where it is weak or has true shortcomings. Sure, it is inevitable that the package won’t do exactly what I want all the time, but that is the nature of the beast. On the whole, however, you can tell that Unity3D has been put together by people who understand the subject matter and knew what they were doing and what developers need. There don’t seem to be a whole lot of useless because-we-can features, and all the areas I have touched upon so far in my tests have been thoroughly thought out, with interfaces that are precise and complete. As a software developer one really can’t ask for more. And to think that the basic version of Unity3D is actually free… it truly boggles the mind.

When I started playing around with Unity3D, one of the first things I had to decide was whether I wanted to program in JavaScript or C#. I had worked with JavaScript over the years, porting countless J2ME games to BREW, but somehow it never really clicked with me. I had never worked in C#, though I had heard some good things about it. Looking it over briefly, it seemed very similar to C++, the language I’ve used most for the past 20 years or so, and it was that similarity that made me decide to use C# in the end.

Ironic, I know. Me, using a programming language developed by Microsoft… ah well, the world is full of surprises. But in all honesty, so far C# has been pretty good to work with. There are a few things that I do not particularly like – instances like the lack of static array initialization, where the language has taken a few steps backwards rather than forward – but on the whole, it’s been cool and fun to work in C#. The main problem I have with it – and that’s really just a minor gripe – is that I think it is too user-friendly. The language makes it possible to implement certain things in a very quick and easy manner, making it seem super-cool, but if you look under the hood, you realize that a tremendous code overhead has to be generated to make that functionality possible. A single two-line code snippet may therefore result in brutal code bloat and a serious performance sink without the programmer ever suspecting it at first.
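To make the point concrete, here is a contrived sketch of my own – not code from my project – showing the kind of hidden cost I mean. Both methods build the same string, and the first looks perfectly innocent:

```csharp
using System.Text;

class ConcatDemo
{
    // Innocent-looking loop body: each "+=" allocates a brand-new string
    // and copies everything accumulated so far, so building an n-character
    // string costs O(n^2) work and leaves n temporary objects behind for
    // the garbage collector.
    static string Slow(int n)
    {
        string s = "";
        for (int i = 0; i < n; i++)
            s += "x";
        return s;
    }

    // Barely more code, but StringBuilder grows a single buffer in place:
    // O(n) work and next to no garbage.
    static string Fast(int n)
    {
        var sb = new StringBuilder(n);
        for (int i = 0; i < n; i++)
            sb.Append('x');
        return sb.ToString();
    }
}
```

Nothing in the source code of the first version hints at what actually happens under the hood – which is exactly my gripe.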

This is not a C# problem, though. It is a trend that most modern programming languages have been following – including Java and JavaScript – and while it seems great at first, to me as a down-to-the-wire programmer who used to count CPU cycles for a living in assembly coding, it is really the wrong way to approach programming. Most of today’s software is clear evidence of that. Just look at the resource hogs that simple applications like word processors have become. It is not because they have so many more really cool features, but because their implementation is just totally shoddy.

But I digress…

So, I’ve been doing some work in Unity using C#. I’ve never been big on 3D programming – it’s never been something I was particularly interested in – but imagine my surprise when it took me no more than a few minutes to script a simple 3D application in Unity. A few hours later I actually had a scene with some user control. This is hardcore. There was virtually no learning curve to that point, and what took me the longest was looking up the C# syntax for what I wanted to do, because, as it turned out, in the details C# is quite a bit different from C++.
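For readers who have never touched Unity, a user-controlled object really can be this small. The following is an illustrative sketch of the kind of script I mean, assuming Unity’s default “Horizontal”/“Vertical” input axes; the class and field names are my own:

```csharp
using UnityEngine;

// Attach this to any GameObject in the scene, e.g. a simple Cube.
public class SimpleMover : MonoBehaviour
{
    public float speed = 5.0f;  // units per second, tweakable in the editor

    void Update()
    {
        // Read Unity's default input axes (arrow keys / WASD) and move the
        // object, scaled by deltaTime so movement is frame-rate independent.
        float dx = Input.GetAxis("Horizontal") * speed * Time.deltaTime;
        float dz = Input.GetAxis("Vertical") * speed * Time.deltaTime;
        transform.Translate(dx, 0.0f, dz);
    }
}
```

Drop that on an object, press Play, and you have interactive 3D – that is roughly the scale of effort involved.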

There is a learning curve, of course. No middleware could be without one. It is the nature of the beast, but the fact that it took me virtually no time to jump in and do things in Unity showed me that this was very well thought through, because the API made sense from the get-go, and Unity’s naming convention is clear and to the point. No guesswork involved.

As I expanded my tests and began to do a little more specific programming, I had to spend some more time reading the documentation, but it was always a very clear process and I never had the sense of being overwhelmed – despite the fact that Unity3D has such a wealth of functionality that it could make your head explode.

Over the past week or so I ran into a small problem. I was doing some GUI programming and had planned to do some fancy visual effects with texture blending, only to find out that Unity’s GUI class is somewhat limited — a lot, actually. Some of the functionality I was looking for is simply not there. I tried to dig deeper, but everywhere I looked the tenor was that Unity couldn’t do it, so I decided to look at NGUI, one of the third-party packages that are available. But just as I was playing around with it, a work-around for my problem came to mind, so I switched back to my original project and tried it out, and indeed, I had found a working fix to draw GUI elements using different shaders. This was a relief, because I would not have to rewrite what I had done so far and could simply continue down the path I was going.
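For the curious, here is a sketch of the general idea – the details of my actual fix differ, and the shader choice and names below are purely illustrative. Instead of the stock GUI pipeline, the lower-level Graphics.DrawTexture call accepts a material, and the material’s shader controls how the texture is blended:

```csharp
using UnityEngine;

public class BlendedGuiElement : MonoBehaviour
{
    public Texture2D texture;        // assigned in the inspector
    private Material blendMaterial;

    void Start()
    {
        // Any shader with the desired blend mode will do; additive
        // blending is used here only as an example.
        blendMaterial = new Material(Shader.Find("Particles/Additive"));
    }

    void OnGUI()
    {
        // Only draw during the repaint event, not layout or input events.
        if (Event.current.type == EventType.Repaint)
        {
            Graphics.DrawTexture(new Rect(10, 10, 128, 128),
                                 texture, blendMaterial);
        }
    }
}
```

The point is simply that the blending behavior moves out of the GUI class and into a shader you control.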

Even though this immediate problem has been solved, I have now run into another one: I would like to render a 3D object in front of the GUI, and Unity does not immediately allow me to do that. So, it’s time to figure out another work-around.
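One work-around I may explore – purely a sketch at this point, and one that assumes a license with RenderTexture support – is to render the object off-screen with a dedicated camera and then draw the result during the GUI pass, so it composites on top:

```csharp
using UnityEngine;

public class ModelOverGui : MonoBehaviour
{
    public Camera modelCamera;   // a second camera aimed only at the model
    public Rect screenRect = new Rect(20, 20, 256, 256);

    private RenderTexture rt;

    void Start()
    {
        // Redirect the model camera into an off-screen texture.
        rt = new RenderTexture(256, 256, 16);
        modelCamera.targetTexture = rt;
    }

    void OnGUI()
    {
        // Drawn as part of the GUI pass, so the rendered model appears
        // in front of any GUI elements painted before it.
        GUI.DrawTexture(screenRect, rt);
    }
}
```

Whether this performs well enough in practice remains to be seen, but it shows the shape of the solution.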

As you can see, there are still a few things I need to figure out and get to work, but I am hopeful. Perhaps I will decide to use NGUI after all, when everything is said and done, if only because it might allow me to get certain tasks done more easily. I haven’t decided yet, and that is what makes Unity so great and so much fun to play around with.


My latest project, THORVALLA revealed…

For the gamers among you, I have some exciting news to share. For the past months I have been working on a new project that took me back to my computer gaming roots. Teaming up with veteran game designer Neal Hallford, I have prepared a concept for a cool new computer role-playing game that we are currently trying to fund through Kickstarter.

As many of you may know, I’ve been developing computer games for over 30 years, and most of them were role-playing games. In recent years I’ve diversified into different areas, such as my book writing, but the game bug bit me again this year, especially because so many of you seem to still remember and enjoy some of the games I made, like the Realms of Arkania trilogy and Planescape: Torment. (For a cool look behind the scenes of the making of the cover of Planescape: Torment, don’t miss this blog post I made some time ago.)

Neal has been in the industry almost as long as I have, and he was one of the co-designers of Betrayal at Krondor, a wonderfully rich RPG based on the books by Raymond Feist. Neal has gone on to work on games such as Might & Magic III: Isles of Terra, Dungeon Siege, Lords of EverQuest and many others.

So here we are, teaming up, supported by a team of incredibly talented artists and programmers who are ready to bring our latest game, Thorvalla, to life. (Yes, I will not only co-design but also do programming on the project, because I’ve always felt programming is my true vocation.) Thorvalla, as the name already suggests, is a game steeped in Norse lore, a world where men and dragons are at peace and fight together to vanquish evil. It features a vast world with many cultures while remaining true, at its core, to a high-fantasy setting that includes staple favorites like orcs, ogres and skeletons alongside cool monsters from world lore.

You can help us make this game a reality. Take a look at our Kickstarter campaign page for more information. We’ve sadly had a slow start and can use every bit of support we can find. So, if you are a gamer, or if you have friends that love roleplaying games, let them know. Talk about Thorvalla, tweet it up, put it on your Facebook wall or whatever else you can do to help us spread the word.


The Spirit of Poe a major bust

[Just a quick note here before you read this article. Since writing this blog post I have received a contributor copy of “The Spirit of Poe” from Jeremiah Wright. However, I still have to point out that I received the copy in mid-March 2013, and only after relentlessly sending emails requesting a copy. Considering that the book was supposed to be published in October 2011 and was actually published in July 2012, this is a substantial delay, one that was bridged by deception and complete radio silence in between.]

As many of you may recall, in the past year I have occasionally talked about The Spirit of Poe, an anthology that was designed to support the Poe House in Baltimore after it lost its city-sponsored funding. A company by the name of Literary Landmark Press put out a call to writers at the time, asking for submissions for the book and I was one of those who answered the call.

Sadly, things went downhill from there. At first it seemed minor: delays prevented the book from making its 2011 Halloween publishing date. Okay, fair enough – I thought the timeline had been a tad unrealistic to begin with. But then the months started to drag on. Not a word from the publisher. Eventually I sent a message to Jeremiah “Jerry” Wright, the editor of the book, who also goes by the name WJ Rosser, and asked for clarification. He explained to me that various circumstances had held back the introduction of the book, which he felt was crucial to its credibility.
Very well then. More months passed, and not a word from the publisher. Eventually the authors got upset as a collective, and we started to email each other, trying to get to the bottom of this. At first Rosser tried to avoid the conversation by ignoring emails and questions. After some time he had to budge, though, and offered more excuses, promising the book would be available shortly, as it was currently being typeset.
Sadly for him, someone actually checked with the company Rosser used to lay out the book and found out that they did not even have the materials to work on the project. Again, we queried Rosser for comment. Reluctantly he responded, telling everyone that the company was wrong and that he had, in fact, delivered all the materials. And so it went, month by month.

Rosser never made any attempt to inform his contributors or the public about the status or progress of the book, and then one day, about a month ago or so, it popped up on Amazon – for the Kindle first, and then as a print edition.

Naturally, we were all very excited, especially when for the first time in a year, Rosser volunteered an email in which he stated that contributor copies had been sent out and should be with everyone within a few days. Well, weeks passed and nothing arrived. Not on my doorstep, and not on anyone else’s, it seems.
And that was when Jerry Rosser practically vanished…

By this point I had sent five emails to him, asking for clarification on what happened to the contributor copies. He did not respond to a single one. Other authors sent emails to him, asking for information and their contractually promised payment. Not a peep. Rosser all but ignored the questions. But there’s more. When people started to post questions on his website, he deleted them, and when people posted questions on the book’s Facebook page, he removed those just as quickly. When one author posted a negative review on Amazon’s website, pointing out the publisher’s fraudulent behavior, it, too, was removed within a few days – undoubtedly at the request of Rosser, the publisher.

So, quite evidently, he is out there, he is monitoring what is going on, and he deliberately refuses to talk, cheating the contributors out of their money and the obligatory contributor copies of the book.

It is not usually my style to openly comment on deals going sour and relationships going bad, but this time I felt compelled to speak up, because not only have I been jilted – many of you might be at risk of being cheated as well. Whenever someone purchases a copy of “The Spirit of Poe,” they expect the majority of the revenues to go to the Poe House for a charitable cause. Sadly, at this time, I have reason to believe that this is not happening.

Since Literary Landmark Press has cheated every single writer in the anthology out of their payment, and since the company has never provided any actual copies of the book to its contributors, there is little that would convince me that the publisher is honest enough to actually make good on its promise to donate proceeds to the Poe House.

I wanted to bring this issue to your attention so that you may decide for yourself, in case you consider buying a copy. Meanwhile I will try to find a different outlet for the short story The Blackwood Murders that I contributed to the book, so that people interested in reading it will not have to actually support a crook.


Time to rethink Kindle content generation

The announcement of the next generation of Amazon’s Kindle has set the eBook world abuzz once again. Not only are the new models more attractive than their predecessors, but they also expand the market into new, untapped territories. For authors, this is great news, of course, but often, where there is light there is also darkness.

Kindle Paperwhite

In this case, the cloud on the horizon lies in the technical specs of these new devices. With a bit of worry I have observed over the past year or two that the eBook market is becoming more and more fragmented. In a very bad way, it reminds me of the mobile game space I have also worked in, where, at times, it was necessary for us to build up to 200 different versions of the same app to make sure it properly supported all the handsets in the market.

While the eBook market is not nearly as bad, of course, there is an increasing trend of changes – or call them features and improvements – that can work like sand in a ball bearing.

Fortunately we have to contend with only two generic eBook formats at this time – MOBI/KF8 and EPUB – and it is easy enough to build eBooks for both formats from the same sources.

However, since the inception of the iPad, problems have cropped up that force eBook publishers and formatters to think very hard about what it is they want to do and how to achieve the desired effect. Fixed-layout books, with their particular quirks and the lack of a general standard to create them, are just one of the issues publishers have to tackle these days, and it is exacerbated by the fact that even within the Kindle line of products, it is not possible to create specialized builds for each platform. A fixed-format Kindle Fire eBook will inevitably make its way onto a regular Kindle – where it doesn’t belong – because Amazon does not give publishers the ability to create device-specific builds. As a result, Kindle owners will look at a book that is horribly mangled and probably unreadable, while it looks mesmerizing on a Kindle Fire. I am not sure in whose best interest that is, but that’s the way Amazon does it.

The reason I am writing about this is that, according to Amazon, the new Kindle Paperwhite line of models offers 65% more pixels. In plain English, that means it has a higher resolution than previous Kindles. That is really great news with regard to the sharpness of the text, of course, but from a formatting standpoint it causes certain problems. An image that was perfectly sized for the Kindle’s 600-pixel resolution to date will suddenly appear much, much smaller on the page. In many instances this will not be overly dramatic, but if you use images deliberately as a design element, it will force you to rethink how you approach images in eBooks. Just imagine how tiny the image will look when it is displayed on the new Kindle Fire HD, with a resolution three times as wide as that of the original Kindle.

How would you like your artful chapter heading to look?

In the past I have sized images to suit the 600-pixel screen. It helped keep the file size at bay – why bulk up a book’s footprint for no apparent reason, especially since the publisher is charged for the delivery of the book based on the size of the file? This approach may no longer work, however, if you want high-quality images across the board.

I have therefore been rethinking my strategy, and going forward I am sizing images to a higher resolution and then determining their on-screen size using scaling through my CSS style sheet. This allows me to make sure the image will always occupy the same space on the display, without degrading it on higher-resolution screens. If anything, quality may suffer slightly when images are scaled down for the older Kindle models.
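In practice, this amounts to only a couple of lines of CSS. A sketch of what I mean, with a hypothetical class name and a source image exported at, say, 1200 pixels wide:

```css
/* The image file itself is large (e.g. 1200px wide); the stylesheet
   constrains it to a percentage of the screen width, so it occupies
   the same visual space on a 600px Kindle and a Paperwhite alike. */
img.chapter-ornament {
    width: 80%;    /* relative to the screen, not fixed pixels */
    height: auto;  /* preserve the aspect ratio */
}
```

The percentage width is what decouples the visual size from the device resolution; the larger source file simply provides the extra pixels for the sharper screens to use.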

If Amazon offered platform specific builds for their line of Kindles, this would not be a problem, but things being what they are, a one-size-fits-all approach is necessary, and hopefully, this will do the job.

In many ways, I wish that Amazon would make me part of their Kindle design team or at least would allow me to work with them. After all, I’ve had over 35 years of experience as a software engineer in arenas that were a whole lot more complex than an eBook reader.

Many of you may remember my post 10 Things Amazon should correct in the Kindle from a year ago, and it is rather disheartening to see that not a single one of these issues has been addressed to date. While I have not seen a Kindle Paperwhite at this time, I doubt there will be many changes in the firmware that would address them. It seems to be more of a change in form factor and a hardware upgrade than a rework of the actual reader implementation – but I could be wrong, of course.

To me as a software engineer, author, publisher and professional eBook formatter, the omissions are truly painful to behold. Amazon has done great things for books, truly establishing eBooks as a reading medium, making them the new mainstream standard, all the while opening the doors for authors to publish their own work. All great achievements, and I honestly doff my hat to Amazon for their incredible foresight and the vision they had during the past three years.

That, however, makes the technical shortsightedness all the more apparent. All of the issues I raised before have been around since day one, and clearly someone within Amazon should have championed their correction. It did not happen. Not even when people like myself and others called them out.

Amazon had never been a software or hardware developer before the Kindle, and as such it was to be expected that there would be hiccups in the product and the delivery. No big deal. However, the market has reached such maturity that glitches like inconsistent text justification, the lack of transparency in PNG images and other omissions have become glaring issues that should have been resolved two years ago.

The Kindle has to mature, and it has to mature with foresight, or we are going down the road of mobile games, where you need 200 individual builds of an app. There are great developers out there who would have been happy to assist Amazon in their objective, but instead of embracing them, Amazon has often shunned them.

A command-line MOBIGEN program is just not the same as the luxury you get out of a program like Calibre. Amazon should have long looked into creating high-quality content creation tools that help authors increase the quality of their output. Too many self-published books are still created with MS Word exports or InDesign plug-ins that cause more problems than they solve.

Amazon should also have long started to put in place platform-specific delivery of eBooks, along with ways for authors to properly set up books for each of these platforms.

Amazon should also have expanded their eBook format in ways that are truly practical without having to jump through hoops. The introduction of KF8 was a horrid debacle, to say the least. It confused authors and readers alike, and the implementation is not what it should be – many things could have been implemented much more efficiently, making it easier for formatters to prepare the eBooks while also giving them a certain level of control over the appearance of their content. If you’ve ever tried to take a look at a black and white line-art image in the “Night” setting of your Kindle, you know what I mean, and the whole image sizing issue puts the dot on the i, I think.

I don’t want to harp on this excessively, but it also appears as if Amazon has long forgotten its pledge to bring KF8 support to the Kindle 3 generation of devices. As far as I can tell, that has never happened either, and yet, the train of model innovation moves on…

With all the new glitz and glamour that accompanies every new Kindle model, for publishers, each new generation brings with it a new set of challenges. It’s not necessarily a bad thing, but as I said, I wish Amazon would allow me to work with them to help them make these transitions as easy as possible, at least from a content creation standpoint. If anyone from Amazon is reading this, you know where to find me…


A stroll down the DVD memory lane

For the past weeks we here at the Henkel household had a period where we went back in time, of sorts, and did a lot of re-watching of movies we hadn’t seen in a long time. It has to do with Lucas now coming into an age where he can appreciate many of the movies that Lieu and I love, particularly the comedies.

So I dug through the countless boxes of DVDs in the attic and picked out a few flicks we had not seen in a while and that I thought Lucas would enjoy.

Now, I have to say that I have not watched a DVD in about five years. Ever since I switched to Blu-Ray, DVD just doesn’t really seem to cut it any more, but what can you do when the movie you want to watch was last released in 1998 and has never been upgraded since?

Regardless of that, however, the thing that struck me the most was the memory of those early days of DVD. It reminded me of when we started up DVD Review in 1997. It was a time when the Internet was still in its infancy. Hollywood studios had no email addresses, most didn’t even have websites yet, let alone one dedicated to their home video divisions. I remember sending out countless faxes to studios, getting on the phone with them, introducing DVD Review to them and telling them about our mission to help establish the DVD format as the home video format of choice. Some studios did not understand how an Internet site could be of any value to them, but others had more foresight. I remember vividly that Polygram was the first studio to provide us with DVD copies for review. Kalifornia was the movie—and it had some serious compatibility problems, too, as I recall.

Boy, things have come a long way for sure in these past 15 years.

As we watched one of the films the other night, Death Becomes Her, to be exact, I stared at the screen in disbelief for a moment. That was a fullframe transfer. A pan&scan transfer of a movie, in fact, that was cropped on the sides… Oh boy, yes, there was a time when studios refused to release movies in widescreen. I mean, no, they did not accidentally frame films incorrectly, they outright refused to release them in their proper widescreen aspect ratios.

And the next night, another memory came back to me while I was watching a movie and the image seemed horribly rough and jagged; the subtitles looking like the font from a Commodore 64. Yes, indeed, there was also a time when studios refused to create anamorphic transfers and used only a fraction of DVD’s actual potential.

Boy, am I glad those days are over. DVD has come a long way. Not only have full 16×9-enhanced widescreen transfers become the standard, especially with the incredibly fast adoption of widescreen televisions, but fortunately with Blu-Ray and high definition video and audio, we are experiencing movies at a completely different level these days. If you don’t believe me, go back into your library and pop in a DVD from 1998 or so. Chances are, if the grain doesn’t kill you, the lack of detail from the compression will. I was very pleasantly surprised how well some of the DVDs held up, though. You can clearly tell which studios cared about their films, and which ones created nothing but shovelware. Even back then, New Line created some of the most sublime-looking DVDs, as I was reminded.

It was nice to revel in this sense of nostalgia over the past weeks. Remembering the early days of DVD Review and how things have developed. Remembering the role we played in nurturing and establishing the digital video format in people’s homes. Few people remember this these days, but websites like DVD Review were crucial at that point in time to carry DVD beyond the level of early adopters. With our web presence, the many screenshots that accompanied each review back then, and the constantly updated news feeds, we made it possible for movie fans to stay on the pulse of what was going on in Hollywood.

I still remember writing the headline “Paramount is in!” back in 1998, and the excitement that came with it, as Paramount used to be one of the biggest hold-outs on the format and this was a major push for the fledgling DVD format. You can still find the news article in the DVD Review News Archive, which goes back all the way to 1997.

But one major studio was still sitting on the fence then: 20th Century Fox Home Entertainment. I remember meeting with Steve Feldstein during a trade show in July. Steve was one of the VPs at Fox at the time, and he was the one in charge of all of the studio’s home video marketing. Rumors were solidifying that Fox’s announcement of support for DVD was imminent, so I confronted him directly, telling him that I knew Fox was about to announce. He looked at me with a smile and simply said “Then you know more than I do.”

It kind of set me back, I remember, but I later learned that this was just Steve’s way of carrying himself. Steve would never say a word too much. He just isn’t the kind of guy you can pull into a conversation and hope to get information out of him that he doesn’t want you to have.

Despite his flat-out denial, however, not two weeks later, 20th Century Fox Home Entertainment officially announced their DVD support, and once again, DVD Review was on the forefront bringing this eagerly anticipated development to movie fans around the country. At this point, DVD was clearly poised to become a success—how much so, no one was able to foresee, however, and I think everyone was surprised how quickly DVD took off and established itself as the home video format of choice, making VHS and Laserdisc all but forgotten relics.

Gradually, the importance of websites like DVD Review faded, sadly, as the mainstream press began to pick up on the success of DVD and studios were more interested in pitching their release to E! Online and their audience rather than sites catering to dedicated movie fans, who would most likely buy their titles anyway—or so the thinking went.

I am looking back on those times very fondly. We made great friends during those years, among the Hollywood studio community, as well as within the creative community, and it is this fondness that keeps telling me to keep DVD Review alive, even after all these years.


A closer look at Power Tuner HD

Today, I am actually opting for a change in pace. While I’ve been blogging about my books, my writing and the technology of eBooks for some time now, more recently I have turned my attention towards other matters once again. Nothing like a little change of pace to keep life interesting. 🙂

A few years ago I wrote an iPhone application called Power Tuner. This was during the early days when the iPhone was first released. As a guitarist, there were many occasions where I wished I had a tuner handy. Whether it’s at the guitar store, where the ambient noise can be so loud that it is hard to tune by ear, even using natural harmonics, or at a friend’s place where a long-forgotten guitar makes a sudden re-appearance, a tuner is something that every guitar player should have in their pocket, just like a plectrum belongs in any guitarist’s wallet.

Being an electronic gadget, a guitar tuner was just not something you could have handy on all occasions up to that point, and as a result there were way too many jam sessions the world over with flat tunings, perhaps permanently damaging listeners’ attitudes towards you. I am just kidding, of course, but truth be told, I realized at that time that the iPhone would make a phenomenal guitar tuner that could be handy any time you’d need it. My idea was to replace a $100 musician’s tool with a $5 app.

Shortly after, I launched Power Tuner and it has been a slow but steady seller ever since. Over the years I had various plans for the tuner, and I tried to brand it with certain artists, but sadly that never materialized despite weeks and weeks of follow-ups and countless phone calls to artist managers. A few weeks ago I had a different idea, which was almost as good as the idea of branding the tuner, and I went to work to fully update the app.

The result is Power Tuner HD, a high definition version of the app that is enhanced for retina-display iPhones and the iPad. That was more of a side-feature, however. The real meat of the update was the inclusion of skins, the way I had intended for the branded version. Instead of using artists, however, I decided to use guitar paint jobs instead.

Brilliant, isn’t it?

Which guitar player hasn’t fallen in love with the gorgeous cherry sunburst of Ace Frehley’s Les Paul? Who hasn’t secretly admired the genius behind Eddie Van Halen’s wildly striped Frankenstein guitar? Fans of the Fender Stratocaster definitely get a kick out of the deep sunburst paint job that has become so iconic in the hands of players such as Ritchie Blackmore, Robin Trower, or Yngwie Malmsteen. The array is endless, and as guitar players, we all have a passion for great paint jobs.

Therefore, I have prepared and included a set of the paint jobs that I found most iconic into Power Tuner HD, allowing guitarists to not only get in tune, but to do so in style.

As you can see from the screen shots in this post, I have recreated all of the aforementioned guitar looks as selectable skins, along with other ones. Check out the George Lynch-inspired Kamikaze skin, or the skin imitating the Picasso guitars John Petrucci has played. A beautiful Tobacco Sunburst is also included, reminiscent of Gibson’s Les Paul guitars. Apart from the original Power Tuner Tuna Skin, I have also included a brushed aluminum skin that resembles a 19” rack unit.

I have plans for a number of other skins that I may release, depending on how popular Power Tuner HD turns out to be. I would love to create a skin that resembles Rory Gallagher’s paint-stripped Stratocaster look, and one that looks like Zakk Wylde’s Bullseye Les Paul. Most of all, however, I would love to add some Steve Vai-inspired skins. And who knows, I might also be looking into iconic skins suggested by Power Tuner users.

Even though Power Tuner HD is now geared towards guitar and bass players, I think it is important to point out that the software is much more versatile than that. With a wide frequency range, the tuner is really suitable for all sorts of instruments, including violins, violas and celli, as well as brass instruments and others. It can even measure the pitch of your voice. If you are a singer, Power Tuner can actually help you practice holding your pitch, as well as train toward perfect pitch for your vocals.

And all of that for less than $5. So, without me blabbing on about how cool Power Tuner HD is, head over to the App Store and get yourself a copy!


Today I would like to welcome Angela White as a guest to my blog, as part of her blog tour to celebrate the release of her latest novel “Adrian’s Eagle.”

When someone says the word Apocalypse, the mind immediately conjures up images of whole cities burning while zombies or crazed people run wild in the streets. There’s always arson and looting, rape and murder, and none of the innocent people caught in the crossfire have a weapon or even know how to defend themselves. Once a little time has gone by, all the characters, good or bad, pack heat, and self defense becomes as important as food and water.

In a real apocalypse, the same will be true, but it won’t be just other people that are dangerous. Alone or in a group, protection will be vital, and even those who loathe weapons and abhor violence will carry them. There simply won’t be any other choice.

Like the refugees in the clip below from my new release, survivors will have to go searching for these life-savers. As time goes by, guns and bullets for them will get harder to find. Stockpiles should be gathered during the weeks after the apocalypse.

“Seven very gifted survivors are destined to rebuild their country after a nuclear apocalypse…If they can stay alive long enough to find each other. Impossible to put down.” – The Review Blog

“Are you sure?” Adrian cut her off. “Don’t turn down destiny. Sometimes, you only get one knock.”
He moved toward the driver’s side and the air suddenly went cold, plunging the Eagles into instant alertness.
Angela blanched as a wave of panic swept over her. “Your gun!” The Witch ordered sharply.
“Boss, watch out!” Kyle’s hand dropped for the Glock, already knowing he couldn’t make it from where he stood.
The single shot seemed to echo forever and all of them, except Adrian, turned to see where it had come from.
Adrian stared at the dead rattlesnake by his tire, listening to its mutated tails twitch, and the Eagles around them stilled, waiting to see if she would be treated the same as one of the men.
“You have one request.”
Angela calmly re-holstered and used the moment to make it all official. “I’ve already asked it.”

Full of realistic and fantasy situations, the Life After War series is a combination of more than 7 genres, so there’s a good chance of everyone liking it and learning a few things about survival at the same time. You can get a free copy of the first book in the series at the link below. It’s free for all of this year to celebrate the possible end of the world on 12/21/2012.

Adrian’s Eagles — Three months after the War of 2012, Safe Haven refugee camp has made it to South Dakota and now holds six of the seven special survivors meant to lead the rebuilding of their country -but it can’t be done until they find a safe place to settle… and who can think of peace when there’s a huge camp of foreign invaders less than a day behind their group and they only want one thing? Safe Haven and everyone inside the light.
Watch the trailer for this series
Free- The Survivors – The bestselling novel that started it all. – See on iTunes
More Scenes of the Apocalypse

Btw, a huge thanks to Guido for hosting me on my Scenes of the Apocalypse release tour. Have you read Dead by Dawn yet? It’s only $2.99! I just downloaded a copy to my Kindle. Gonna have a great summer of reading by the time I gather up all these new books!