Game Development

Dev Diary – Oculus Rift F1 Demo

By | Developer Diary, Game Development, Oculus, Technical, Unity | 15 Comments

From the time we received the Oculus Rift development kit in July 2013, we were keen on building something for it, since there weren't any good games or demos showcasing its true potential. Although the Oculus is still in development and will take time to reach that potential, we felt the need to create something for the new realm of gaming it opens up. We already had a Leap Motion and an Xbox racing controller on hand, and given our prior experience with racing games, we decided to build one.


Eventually we decided that a Formula 1 racing game would work wonders if we could give gamers the experience of driving a real Formula 1 car.

I did some research on Formula 1 circuits and found the Monaco street circuit quite fitting to our requirements, as it offers rich architectural views, proximity to the sea, and some nice-looking buildings along the track.

But as I gathered more information it became obvious that the Monte Carlo circuit was not the right choice. It has many elevation shifts, tight corners, and a narrow track, which make it perhaps the most demanding circuit in Formula 1 racing. Its tight and twisty nature favors the skill of the drivers over the power of the cars. Although we had Formula 1 fanatics in mind when we started, putting that much effort into the physics of the car was impossible given that we only had 3 weeks to finish the demo.

Motor Racing - Formula One World Championship - Monaco Grand Prix - Saturday - Monte Carlo, Monaco

On further research I came across the Valencia Street Circuit, which shares Monaco's geographical characteristics and provides ample beauty and rich architecture. Another fact that helped lock down Valencia was that the last grand prix there was held in July 2012, and the circuit has not hosted the European Grand Prix since. So in a way we are paying tribute to the Valencia Street Circuit by making it our track of choice in the Oculus Rift F1 demo.


In the research phase, Google Maps and Bing Maps were really helpful in categorizing and identifying the monumental buildings and structures that stood out. We kept in mind that not every building needed detailed modeling, since we would primarily show them from the driver's point of view; only buildings with unique architecture were to be modeled in detail.


Then came the documentation phase. In projects like these, documentation is key to finishing in a timely and organized manner. Although only 2 of us were working on the assets, it's best to lay out the documentation so the project can scale; we could just as easily have been a team of, say, 15 people and stayed organized. We used Trello for documentation, listing buildings by name and assigning a level of detail to each.

We marked around 40 buildings along the 5.419 km track and referenced each of them through Google Street View so we had ample data to model them.

For texturing, we marked matching tileable textures from our database; we maintain a library of textures made on previous projects, which always comes in useful.

After completing the modeling and texturing work, we brought the buildings into Unity individually to have their LODs set up and ready for the demo.

Project ThirdCharm Initiated

By | Developer Diary, Game Development, iOS, Platforms, ThirdCharm, Uncategorized, Unity | 15 Comments
[imageeffect image=”1106″ type=”reflection” lightbox=”yes” target=”_self”]

We have been busy working on our next game, code named ThirdCharm. We are pretty bad with names so we just decided to use that for now until we land on a good name.

There is not much we can say or show about it but here is a concept art image and some untextured level designs that we have been trying out. Stay tuned.

[postgallery_grid id=”grid_thirdcharm1″ title=”Project ThirdCharm” data_source=”data-4″ orderby=”ASC” flickr_set=”No username entered” slidesetid=”ThirdCharm1″ content_type=”image” lightbox=”yes”]

A Demo for Oculus

By | Game Development, Oculus, Technical, Unity | 4 Comments

These days we have taken on a very cool project: creating a demo for the Oculus Rift. I won't share the details just yet, as they deserve a few photos, but I solemnly swear we are up to no good.

The core requirement for the project is that it should look good. Like, really good. With this goal in mind I ventured into researching existing solutions and the latest tech, which I had missed reading up on while busy with Death Mile over the past few months. Since Death Mile was created in Unity 3D, and I had been working with and loving Unity for the past year, I started my experimentation there.

Being a graphics programmer at heart, I went crazy experimenting with DirectX 11 tech that I had long been reading about but never worked with. The near-complete exposure of DirectX 11 features in Unity made my mind run wild, but as I ventured deeper I realized I needed to back up a little and set a realistic goal I could finish in the 3 weeks planned for the demo.

During this experimentation I kept working on a test scene with the amazing Marmoset Skyshop for Unity as the core lighting solution, but I wasn't testing the scene with the Oculus itself: it was meant only for trying out techniques, and I was waiting for my new GPU to arrive. Yesterday I finally tested the scene, and I'll have to say it changed my view of the project quite a bit. I had played with the Oculus before, so I knew the resolution is quite low, but I still expected it to show some detail. It turns out that with the Oculus, a scene of moderate realism (IBL for the ambient term, baked ambient occlusion plus SSAO, soft but stable shadows to reduce aliasing, and a few tricks here and there, but nothing mind-blowing) can produce a very good-looking image, since the blurring removes most of the detail. The problem set I am focusing on now is improving image quality on the Oculus (there is an interesting paper on that) and finding alternative fixes for the aliasing problems in Unity.

Google Play Game Services for Unity

By | Game Development, iPhone, Technical, Unity | 2,656 Comments

Google announced Google Play Game Services (GPG) for iOS and Android about 2 days ago at Google I/O. The most important parts for us are the ability to save game progress to Google's cloud and real-time multiplayer support. This brings Google up to par with iOS Game Center.

We just created a GPG plugin for Unity with the following features:

  • Google Cloud Save
  • Leaderboards
  • Achievements
  • Signin/SilentSignin

Google Play Game Services Unity Plugin Demo

In the coming weeks we shall add Android support as well.

Multiplayer matchmaking will take a bit of work, since state sync will have to be done from scratch, just as it does for Game Center. Unity's built-in networking would not help with multiplayer here without a Unity RakNet MasterServer.
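
To make the "from scratch" part concrete, here is a minimal sketch of the kind of snapshot packing a state-sync layer needs on top of an SDK's raw peer messaging. The struct, field choices, and function names are ours for illustration only and are not part of GPG or Game Center; a real implementation would also handle endianness, delta compression, and interpolation.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// One snapshot of a player's state, sent every tick over the SDK's
// unreliable "send message to peers" channel. Illustrative only.
struct PlayerState {
    uint32_t playerId;
    float    x, y, z;     // position
    float    heading;     // facing direction in radians
    uint32_t inputBits;   // buttons pressed this tick
};

// Pack the state into a byte buffer; returns bytes written, 0 on failure.
size_t packState(const PlayerState& s, uint8_t* out, size_t cap) {
    if (cap < sizeof(PlayerState)) return 0;
    std::memcpy(out, &s, sizeof(PlayerState));
    return sizeof(PlayerState);
}

// Unpack a received buffer back into a state struct.
bool unpackState(const uint8_t* in, size_t len, PlayerState* out) {
    if (len < sizeof(PlayerState)) return false;
    std::memcpy(out, in, sizeof(PlayerState));
    return true;
}
```

The same pack/unpack pair works against either backend, which is why the sync layer, once written, can sit above both GPG and Game Center.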

The code is available as open source at

Update June 16, 2013

The code now supports Android plugin as well.

Mobile World Congress 2012

By | Android, Blackberry, Game Development, General, Microsoft, Nokia | 4 Comments

I had a chance to go to Barcelona for Mobile World Congress 2012 at the end of February. Here are some pictures and what was interesting from a game development perspective.


The NVIDIA stand was big as usual, with TVs attached to tablets. I got to see NVIDIA tablets running Tegra 3; I think the tablet was a prototype made by ZTE.

NVIDIA Booth with Tegra 3 being the most highly visible product/tech


Huawei Smartphones and Tablet

Interestingly, Huawei has started to manufacture tablets and smartphones. They looked really nice and polished; something interesting for the future. All were running Android 2.3 (phones) or 4.0 (tablets).


ZTE launched its smartphone, the ZTE Era, as well. The phone runs Android. No pictures for that either.


Rightware was an interesting company. They provide a system called Kanzi UI for creating 3D UIs on mobile platforms. Don't forget to check out their demo video. They even had a device with a glasses-free 3D display. The UI engine works on OpenGL ES, so it should be possible to use it in-game for creating menus and the like. I was promised a trial licence and will know more when we try it ourselves.



Qualcomm's Snapdragon was very interesting. I had a short game (CS-like?) with Manish, who is Product Manager for Graphics, Gaming and Tools. I hope he forgives me for making him lose the game he had been playing all day. I am excited to get my hands on their tools and see what we can learn and improve for our games.


Blackberry had a big booth as well, bigger than I was expecting. We have already released BLAZ3D for Blackberry, and it was nice to chat with people there. The guys at the developer connect section were very helpful; we discussed a few things and got some contacts to stay in touch and keep updated with the RIM world.
The most interesting part was a Porsche running the Blackberry Tablet OS (QNX).


The Android booth was big as well. I stayed there quite a long time and had fun chatting with the guys. The complimentary ice cream sandwiches and smoothies were awesome. I also took a ride down their slide and got a printed picture from their automatic camera. The whole Android booth felt playful and less business-like. Too bad they didn't have stickers this time, so no free Android goodies. As for tech, it was the usual Android tablets and HTC/Samsung phones on display.



The Nokia and Microsoft booths were separate and quite far from each other, even in separate halls. Nokia's booth was like a whole hall in itself, probably the biggest booth in the whole conference. There were phones like the Lumia on show, but I forgot to check out the Nokia 808 PureView. We are hoping to release BLAZ3D for Symbian soon.



Microsoft had pretty much just a big blue wall. I am not sure if it was some sort of secondary booth, but it hardly had anything. One interesting thing was a scoreboard comparing phones based on speed; I think it was similar to the Smoked by Windows Phone challenge. I got some Windows Phone stickers, but that was it.



These are really interesting times for mobile game development. Smartphone processing power, graphics, and resolution are increasing rapidly. More players are entering the market, which should shake things up, and mobile OSes are finding uses beyond phones and tablets (the Porsche example).

Artificial Intelligence in BLAZ3D

By | Developer Diary, Game Development, Technical | 4 Comments

As we have already released BLAZ3D on iOS and the PlayBook, I would like to share a bit about the development process of the game, specifically the artificial intelligence (AI) part.

BLAZ3D was initially planned to run on devices such as the Nokia N95 and Nokia N82, and later the iPhone 2G, so we were quite restricted by the hardware. We needed a sturdy solution that would give us AI that acted smart yet didn't eat much processing power. In fact, considering the load Bullet physics was already putting on the device, we had allocated a near-zero budget for AI.

The AI was supposed to be able to pick up power-ups, use shortcut ramps, and avoid obstacles. With these goals in mind we started designing it, but we quickly realized the AI also needed to stay constantly competitive: the levels varied widely in difficulty, so the AI would otherwise become too easy to defeat, or at times too hard.

The solution we came up with for near-zero-CPU AI was simple splines; not just the traditional single best-path spline, but distributed splines. Through each level we manually placed small regions of splines, giving the AI connecting options.

For example, a straight spline would run through a straight piece of tunnel until it reached power-ups, at which point 3 splines would branch off the main spline, pass through the power-ups, and finally merge back into one. We had some complex scenarios as well, where extended splines gave the AI the option to turn sharp through the next corner and catch the near power-up, or turn loose and catch the power-up near the far wall of the tunnel.

Each spline had an identifier in its name, plus a list of children telling the AI which splines it could choose next. But this still meant the AI was pretty dumb, so to give it more intelligence we simply added one more variable to each spline: the probability of using it. Then we assigned each AI its own probability on load. So if there were 3 AI in a race, there would be one AI who would always be tough to defeat (much like the red enemy in Pacman <sad to see that wordpress spellcheck tries to correct pacman>).

By this simple per-spline probability, we managed to bring each character to life with this equation:

[highlight type=”one”]finalProbability = nextLinesProbability + enemyProbability + a random value between 0 and 0.25[/highlight]

We loop through all the lines in the child list with this equation, and if a line's finalProbability goes over 1, we go ahead with that spline. This simple system allowed us to give each AI a dynamic feel with just a few additional operations.
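
As a rough sketch, the selection loop looks something like the following. The names (Spline, pickNextSpline) and the fallback rule are ours for illustration; the actual BLAZ3D code differs.

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// A spline region authored in the level, with its per-spline weight
// and the ids of the splines it can merge into.
struct Spline {
    int id;
    float probability;         // authored per-spline weight
    std::vector<int> children; // ids of possible next splines
};

// Each AI gets its own enemyProbability on load; a higher value makes it
// more likely to commit to a good line (a tougher racer).
int pickNextSpline(const std::vector<Spline>& splines,
                   const Spline& current, float enemyProbability) {
    for (int childId : current.children) {
        const Spline& child = splines[childId];
        float r = (float)rand() / RAND_MAX * 0.25f; // random 0..0.25
        float finalProbability = child.probability + enemyProbability + r;
        if (finalProbability > 1.0f)
            return childId; // this spline crossed the threshold: take it
    }
    // No child crossed the threshold; fall back to the first option.
    return current.children.front();
}
```

Because the random term is re-rolled every choice, even two AIs with the same enemyProbability diverge over a race, which is where the "dynamic feel" comes from.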


OpenFeint extension for Marmalade

By | Android, Developer Diary, Game Development, iPhone, Technical | 12 Comments

We wanted to integrate OpenFeint into our game BLAZ3D, but there was no extension for Marmalade, the game engine we used, so we decided to create an OpenFeint extension ourselves.
The extension and relevant documentation are available in the Marmalade GitHub community repo at [highlight type=”two”][/highlight]

The extension supports the following APIs:
[list style=”arrow” color=”blue”]

  • User login/approval
  • Display of main dashboard
  • Display of dashboard directly at a specific page like Challenges, Achievements, Leaderboards (iOS only)
  • Sending of challenges (iOS only)
  • High-score submission to leaderboards
  • Achievement unlock
  • iOS push notifications for OpenFeint (iOS only)
[/list]

Some sample code snippets (not all APIs are available on Android, as OpenFeint doesn't support them there either):


Initialising OpenFeint

s3eNOFSettingVal *settings = (s3eNOFSettingVal*)s3eMalloc(sizeof(s3eNOFSettingVal) * 6);
// Fill settings
// UI orientation value
settings[0].m_intVal = s3eNOFUIInterfaceOrientationPortrait;
// settings[1] holds the short display name (string value, omitted here)
// Push notification setting
settings[2].m_intVal = 1; // TRUE/YES
// Sandbox notification mode
settings[3].m_intVal = 1;
// Disable user-generated content
settings[4].m_intVal = 0;
// Disable ask-for-approval in debug mode
settings[5].m_intVal = 0;
s3eNOFArray array;
array.m_count = 6;
array.m_items = settings;

Launch Main Dashboard after login

// Open the main dashboard (after login has succeeded)
s3eNOFlaunchDashboard();

Pull Information about all achievements for user

const unsigned int max_achievements = 20; // whatever we think is the max we need
void* data = s3eMalloc(sizeof(s3eNOFAchievement) * max_achievements);
if (data == NULL) {
  AppendMessageColour(RED, ("Couldn't allocate data"));
  return false;
}
s3eNOFArray achArray;
achArray.m_items = data;
achArray.m_count = max_achievements;
// (call into the extension here to fill achArray with the user's achievements)
for (uint i = 0; i < achArray.m_count && i < max_achievements; i++) {
  AppendMessage("Achievement title %s", ((s3eNOFAchievement*)achArray.m_items)[i].title);
}
s3eFree(data);

Update December 30, 2011

The extension now supports both Android and iOS. The Android API is extremely limited by OpenFeint itself, so most APIs in the extension just return -1 and do nothing.

Analyzing cross-platform engines for mobile phones

By | Android, Developer Diary, Game Development, iPhone | 53 Comments

Quest for Mobile 3D Engine

When we started development on our first title, the first thing, as usual, was to work out our needs and select an engine accordingly. After considerable debate, we knew we really needed a cross-platform engine for mobile phones with solid 3D APIs and support for as many platforms as possible. This hunt, coupled with our naivety on a first title, led us to the EdgeLib engine. Our experiences with it were scarring, to say the least, but definitely a huge learning experience, which we will leave for a post in the future. It sent us hunting for a stronger solution we could call our base, our motherland, and that is exactly what we found with the Airplay SDK, now called Marmalade.

Marmalade is what we call a truly cross-platform engine. Supporting over 6 different OSes and truly abstracting the API away from them is a very big deal, and providing extensive support plus a huge array of helper tools is unheard of in an engine that supports indie developers. Instead of going through all of its features, I will just point out a few that really caught my eye:

Memory Management

Marmalade provides a very nice solution for managing memory. Instead of providing fast alloc calls and calling it a day, Marmalade has a system of buckets. You allocate your data inside buckets and specify the size of each bucket in a configuration file, so you know exactly how much memory you are consuming in which area. Such a system also makes it easier for Marmalade to manage memory, so memory leaks are detected immediately. Add an internal system for tracing call stacks and you get a system that points to the exact code block causing the leak.
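
To illustrate the idea (this is a conceptual sketch, not Marmalade's actual API), a bucket system boils down to charging every allocation against a named, fixed-size budget, so an overrun or a non-empty bucket at shutdown points straight at the subsystem responsible:

```cpp
#include <cstddef>
#include <map>
#include <string>

// Conceptual bucket tracker: each named bucket has a fixed budget
// (in a real engine, read from a configuration file) and every
// allocation/free is charged against its bucket.
class BucketTracker {
public:
    void defineBucket(const std::string& name, size_t budget) {
        budgets_[name] = budget;
        used_[name] = 0;
    }
    // Returns false if the allocation would exceed the bucket's budget.
    bool charge(const std::string& bucket, size_t bytes) {
        if (used_[bucket] + bytes > budgets_[bucket])
            return false;
        used_[bucket] += bytes;
        return true;
    }
    void release(const std::string& bucket, size_t bytes) {
        used_[bucket] -= bytes;
    }
    // Non-zero usage at shutdown means a leak in that bucket.
    size_t used(const std::string& bucket) const {
        return used_.at(bucket);
    }
private:
    std::map<std::string, size_t> budgets_, used_;
};
```

The per-bucket accounting is what makes the leak report precise: instead of "the game leaked", you learn "the textures bucket leaked", and a call-stack trace on top of that pinpoints the code block.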

Scalable 3D pipeline

Airplay not only abstracts basic OS API calls across all platforms but, because mobile phone hardware varies so widely, also provides a scalable 3D pipeline. Supporting both GLES 2.0 and GLES 1.0/1.1 is a huge feature in itself; couple that with a low-level fallback software renderer and you get a game that will run on any phone in the market, whether it has a GPU or not.

The Simulator

Airplay has a solid simulator for running the game on a PC or Mac. It goes well beyond what any phone vendor has provided so far, with features like accelerometer simulation, switchable graphics drivers, compass simulation, and some very detailed metrics for your game. You can reconfigure the simulator in real time to get as close as possible to the real device. The metrics are also a very handy optimization tool, showing every API call made and how many times, along with the exact memory usage of each bucket, which again helps in optimizing your game for low memory use.

The Extra Tools

Airplay provides some very neat external tools that integrate with the library very well. It has a system of resource groups and build styles that lets you sort your game's data into tidy resource files, create multiple editions of your game for different phones, and manage all the assets easily from those files. The best part: we just specify the compression format for each texture in the resource file, and Airplay automatically compresses them on first build and loads them in-game from the compressed files. This removes a lot of headaches, like manually converting each texture to different compression formats for different GPUs. Airplay also provides the derbh archive module for game data compression which, coupled with texture compression, can reduce your game's data by 60-70%!


The feature set never ends for this beast of an engine. Of course it has its faults at times too, but they are always very minor compared to the huge list of features it gives us.


[highlight type=”one”]We'll have to say Marmalade is one of the best mobile 3D engines we have used for mobile game development.[/highlight]


E3 2011 from a Mobile Game Development Perspective

By | Game Development, General | No Comments


I think that, from a mobile game development company's perspective, we can take a lot from E3 even though it focuses on high-end console games. Top games were once again first-person shooters; Battlefield 3 is a highly awaited one and is also at the top of my own list.

Shooting games are somewhat tricky to develop on mobile platforms. Touch-screen controls are not great for immersive gameplay where the character moves in 4-8 directions while the camera is moved around and buttons are pressed for shooting. Sweaty hands add to the trouble, and many a time players find themselves touching the screen outside the joystick area while trying to move the character.

Probably the most innovative ideas relevant to mobile gaming come from Nintendo. The newly announced (not yet released) Nintendo controller, the Wii U, closely resembles a typical smartphone: a touch screen, gyro/accelerometers, and good processing power. Its hardware buttons give it an advantage over most mainstream smartphones. Augmented-reality-based war card games could be very nice; I am not sure if they already exist, but placing cards on a table and watching the characters fight and take or inflict damage seems like a good take on the war card game genre. The combination of the real world with a virtual world is always amusing.

The Nintendo 3DS is a pretty awesome device, and maybe we can take something from it as well. We don't have the hardware to display 3D without glasses, but we could use methods like stereoscopic display with glasses borrowed from the local 3D cinema. I already have a few with me and will surely experiment with that.

One natural advantage that smartphones and handheld devices (with accelerometers) have is full 360-degree movement, which you can't do on a console: you always have to face your television while playing console games.

At the end of the day, mobile phones have their own unique advantages and disadvantages, and games built with these differences in mind can give a much better experience than games ported from other platforms.

Who are Nerdiacs?

By | Game Development, General | One Comment

We are a small team of developers who share the goal of creating excellent high-end games for mobile phones. We are registered in Singapore as Nerdiacs Pte Limited, with a subsidiary in Pakistan. Our game development is done mostly in Karachi, Pakistan, while marketing and sales are handled from Singapore.

We started with our first mobile game idea, a Labyrinth look-alike. We quickly discarded it as better games already existed in that genre. We then decided to do another 2D game and soon scrapped it too, as we could never get it to look great and ran into plenty of physics issues. These 2 games seemed like failures, but as a team we learned a lot from the experience.

The previous failures didn't put us down; they made us determined to do a better job and differentiate ourselves from most games on mobile platforms. We decided to move to a 3D project knowing very well that it could take 6 months to a year in development. It has taken us longer than that, but our belief in this game has stayed alive and we have never felt the need to scrap it.

We have even more ambitious plans for our next game titles, and we are now quite confident in our skills as a team. We are confident that we will be regarded as one of the best game companies on mobile platforms in a short period of time.

Contact us at or use the form below

[enquiry_form id=”footercontact” emailto=”” thankyou=”thanks, we’ll be in touch soon!” /]