Posted 6 Aug 2016

IoT-VR with Unity and Intel Edison

Virtual reality with Unity, Intel Edison and MQTT.

Table of Contents

1. Introduction
2. Background
3. What We Intend to Do
4. The Proposed Pipeline
5. Let's Start with Unity
6. The Unity Asset Store
7. The Unity Animation Controller
8. Time for the MQTT Client
9. Animation Control
10. MQTT in Unity
11. Google Cardboard Integration
12. Controlling the Puppet with Hardware
13. Controlling the Puppet with Gestures
14. Controlling the Puppet with the Intel Edison IoT Device
15. Why the HoloLens Experiment Failed
16. Conclusion

 

 

1. Introduction

In this tutorial we cover a different perspective on how Unity can be used for a virtual-reality experience. We will work on an animation system to give complete life to our character, then add a new dimension by bringing in MQTT and an Intel Edison board to feel the magic. Along the way we also describe the animation system and the controls that apply state transitions to our character as it moves.

2. Background

This article grew out of research into where IoT could fit into VR. I chose the Unity IDE to start with, for its easy learning curve; it is the best fit for making the project come alive. For the integration part, I researched communication media for remote interaction between Unity and an IoT device, and after weighing the details and feasibility I found the best way to do it was MQTT. For the character, I had to work through what kind of character would fit the project best. I tried Mixamo for detailed character animation and certainly found it useful, but for this project I chose the easier route: a character from the Unity Asset Store with free custom animations associated with it. Here I share my experience of how the project came together.

 

3. What We Intend to Do

Conceptually it was just a dream that I could make all the pieces work together in one place, so it was a tough job to start with.

The Unity part

1) We needed characters that could be used freely for our project, and we had very little time. Creating a character in Blender would take too long (I know very little Blender altogether!). The obvious choice was something freely available in Unity.

2) Unity Asset Store to the rescue: while working out which figures and characters made sense, I thought a doll-like character would give the project good shape, so we started searching. Goodness gracious, the store really does have great assets, and we settled on the Unity-chan model, as it was well explained on the Unity forum and very easy to tweak.

3) Adding animations: we found this character came with lots of animation states to use, so animation control came into the limelight. We studied a lot of tutorials on implementing animations and then we were good to go.

The IoT part:-

This was the trickiest part, as we had to bring a new dimension of interaction into Unity.

1) The obvious question was how to communicate between the Unity project and the IoT device; after a lot of study we found MQTT was the best choice.

2) The next step was integrating MQTT within Unity. It was a tough job and we had to dig deep, but a GitHub library came to the rescue.

3) Next we tweaked the code and integrated it with our scene. I did this by learning the basics and then delving deep into the code.

4) I tried the first communication with the MyMQTT app, mapping the movements onto a set of commands, and yes, it was a success.

 

The Proposed Pipeline

The next figure shows the proposed pipeline for the project workflow. Managing the entire scenario in one place was tedious, but as I gained knowledge different ideas came into view, hence the flow.

Download the latest version of Unity to start with. Make sure you know which platform you are targeting, 32-bit or 64-bit, then go ahead with the download.

The workflow

Let's break down the workflow:

1) We design the game concept in Unity: the movement, the scene, the interaction capabilities. Then we set up animation control and get it going with the keyboard keys (up, down, left and right). It came along very well.

2) We worked out how to do the interactions at the IoT level; after searching through a lot of content we found MQTT the best fit, and integrated it.

3) Next we checked whether we could communicate with the Unity scene directly. On an Android phone we installed the MyMQTT app and subscribed to the channel we configured in the MQTT client. We did basic interactions with numbers to confirm the communication, and it worked out well.

4) We used Android Studio to create an app that integrates MQTT to interact with the Unity scene with forward, backward, left and right movements. It went along superbly.

5) We initially planned to convert the project to a Windows 10 UWP build, but dropped that idea because MQTT was not working with it; step 5 therefore became Google Cardboard integration.

6) We integrated it with RealSense.

 

Let's Start with Unity

1) First we have to download Unity. Make sure you know which platform you intend to install, 32-bit or 64-bit.

2) Go straight to the website and download the executable.

On this page you will have the option to download your version of Unity; just start downloading.

On the next page you will have a variety of download options; we have chosen the free version. The free version allows us to test, build and also publish our app.

In the next step you get the option to download the installer. After downloading it we need to choose target platforms, the basic ones being PC, the Android build and others.

After installation it is our turn to select and start the project. The first time you log in it asks for your Unity ID (make sure you are registered, as with it you can download assets from the Asset Store; otherwise you can also work offline).

We open Unity and give the project a name. Make sure the 3D toggle option is selected, then click Create project.

Before going further we need to know what the Unity Asset Store actually is.

The Unity Asset Store

The Unity Asset Store is a marketplace where you get project assets that you can easily use. Some of the great assets are free and some are paid.

Why use the Asset Store?

For a single developer it is very difficult to find the time to design a character, add animations, and build the other logic around it, and the learning curve is steep when picking up new 3D design tools. To bring our logic into reality we need pre-built animations and characters; we can then devote ourselves to C# code and use the language to its full potential to bring the whole gaming experience to life. We do, however, still have to figure out the animation control ourselves.

The concept of doll

From childhood I have had a special fascination with dolls: keeping one close at bedtime, telling it stories (yes, those golden days were so much fun... I wish I could go back). We readily get attached to baby dolls and weave fairy tales around them. When I started this project I wanted to go back and relive that, so we set out with a simple goal in mind: a system where the doll can move back and forth and is easily controllable through our IoT device.

The search at Asset Store

We always wanted to give our application a classic fairy-tale look, and we found one character and one low-poly environment that suited us very well.

Unity Chan model

This Japanese model was too cute for us to pass up, so we decided to use it. There was also a tutorial on how to use it.

Its flexibility came from:

1) Animations

2) The cuteness of the character

3) Being easily configurable to our liking

For the environment we had to choose a low-poly setting that could give our scene a special feel, and indeed we found one: the "low poly count environment" asset. It was classy and exactly our type.

Mixing them both

The idea was to create a scene bringing both assets together. To start, we took the example from the already-imported asset and opened the demo scene to check the capabilities of the model. It was indeed cool.

Next came importing the low-poly environment and testing the scene. After opening the test scene, we found it was indeed worth working with.

The experiment time

We included the Unity-chan model; now it was time to start the coding and configuration. As the character was facing away from the camera, we used the rotate tool in Unity to change the character's orientation.

Now it was our turn to decide on the animation control system.

The Unity Animation Controller

The Unity Animator Controller organises your animation control in a state-oriented manner. Its speciality is that you can drive different sets of animations from one place. We can also have subsets of animations, called blend trees. First of all, we thought we would try out simple animation transitions.

Let's start again

In the Project tab we right-click to get the option to create an Animator Controller. We name it, then drag it onto the Animator component of the Unity-chan model.

Double-clicking the Animator Controller shows the states. Breaking down the state machine, by default we have Any State, Entry and Exit.

Make sure that Apply Root Motion is unchecked.

Now let's bring in some animations we intend to work with: we drag the wait animations into the picture.

It's time to save the scene. Give it a name and save it.

Time for the MQTT client

 

What is MQTT?

As per Wikipedia, MQTT is a machine-to-machine "Internet of Things" connectivity protocol: a lightweight publish/subscribe messaging transport designed to move messages quickly and reliably.

 

 

We searched the internet a lot, and one GitHub library, M2Mqtt, came to the rescue.

We downloaded it and imported it into our project, then opened the example scene in Unity. It was able to send and receive MQTT messages.

For our sending/receiving tests we thought of a trick: we used the MyMQTT app from the Google Play store, so we could send and receive data on a channel.

MYMQTT app

link

This was the easy way, because it made it very simple for us to communicate.

The app has Dashboard, Subscribe, Publish, Stored Messages and Settings sections.

Settings is the most important section in our view. In the main broker URL field we enter

iot.eclipse.org

and the port is 1883.
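As a sanity check outside Unity, the same broker settings can be exercised with the M2Mqtt library the rest of this article uses. This is a minimal sketch, assuming the M2Mqtt assembly is referenced; the topic name rupam/data is just an example channel, and there is no error handling or TLS:

```csharp
// Minimal M2Mqtt loopback check against the broker configured in MyMQTT.
// Assumes the M2Mqtt library is referenced; not production code.
using System;
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;

public class MqttSmokeTest
{
    public static void Main()
    {
        // Same broker/port we typed into the MyMQTT settings screen.
        MqttClient client = new MqttClient("iot.eclipse.org", 1883, false, null);

        // Log anything that arrives on the channel.
        client.MqttMsgPublishReceived += (sender, e) =>
            Console.WriteLine("Received: " +
                System.Text.Encoding.UTF8.GetString(e.Message));

        client.Connect(Guid.NewGuid().ToString());
        client.Subscribe(new string[] { "rupam/data" },
            new byte[] { MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE });

        // Publish to ourselves; the handler above should print it back.
        client.Publish("rupam/data",
            System.Text.Encoding.UTF8.GetBytes("HELLO"),
            MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE, false);

        Console.ReadLine(); // keep the client alive to receive the echo
    }
}
```

If MyMQTT is subscribed to the same channel, the HELLO message should also show up on the phone, which is exactly the round trip we use for testing.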

 

Animation control

 

Let's go back to animations. Now that we have handled the MQTT part, we will cover the code changes as we get the animation control going.

The structure looks like this after importing the assets and adding the details to the scene.

After adding the wait animations, we see in the Animator Controller that one of the waiting states is shown in orange: that is the default animation state. We can right-click on any state to make it the default. From the other waiting states we make a transition to the default state, as shown in the figure above.

Now we will start working on a C# script. We go to Assets, then Create, then C# Script. This script will be our entry point for controlling the animations; to start with, the script is empty.

using UnityEngine;
using System.Collections;

public class Play : MonoBehaviour {

	// Use this for initialization
	void Start () {
	
	}
	
	// Update is called once per frame
	void Update () {
	
	}
}

 

In the script we first need a reference to the Animator, so we declare:

public Animator Animd;

To get access to the Animator Controller, we fetch the component that drives the animation:

Animd = GetComponent<Animator>();

To react to different keys, we add checks in the Update method; this lets us respond to keyboard presses, touch controls or other inputs:

if (Input.GetKeyDown("1"))
{
    Animd.Play("WAIT01", -1, 0.5f);
}
if (Input.GetKeyDown("2"))
{
    Animd.Play("WAIT02", -1, 0.5f);
}
if (Input.GetKeyDown("3"))
{
    Animd.Play("WAIT03", -1, 0.5f);
}
if (Input.GetKeyDown("4"))
{
    Animd.Play("WAIT04", -1, 0.5f);
}

Here "WAIT01" is the name of the animation state and "-1" is the layer of the animation control system (the control system can be broken into sublayers when you create a transition system; -1 lets the Animator pick the layer). The last argument, "0.5f", is the normalized time from which the animation starts playing.

Now we will work on the horizontal and vertical movement, so we need to declare the corresponding parameters; they are floats. For horizontal and vertical movement we declare two parameters in the Animator, then go back to the script and declare them there too.

private float inputH;

private float inputV;

Now we need to detect the horizontal and vertical axis.

inputH = Input.GetAxis("Horizontal");

 inputV = Input.GetAxis("Vertical");

Now we feed the values of inputH and inputV into the Animator parameters.

Animd.SetFloat("inputH", inputH);
Animd.SetFloat("inputV", inputV);

It's time now to create a new blend tree so we can add a sublayer to the animation control. In this layer we will add movement animations portraying movement in the left, right, up and down directions.

We make the blend tree type 2D Simple Directional, with inputH and inputV as its parameters. In the motion list we add motion fields, four of them, one per direction, and update their values.

With the movement added, we can see what the sublayered blend tree looks like: in each Motion slot we substitute the appropriate movement for left, right, up and down. The movements are now mapped to motions. In the sublayer view, the blue dots are our animations and the red dot is the blend position.

Now we make a transition from the base layer to the sublayer (as before: right-click and make a transition to the sublayer). We select the transition by clicking the arrow connecting WAIT00 and the move state.

We have to uncheck the "Has Exit Time" option; that allows the controller to move to the next state without waiting for the animation to finish.

Next we go to the Conditions section, where we set a condition on inputV or inputH being greater than 0.1 to enter the movement state. We then add another condition: when inputV is less than -0.1 we go back via the reverse transition. We used multiple blend-tree transitions to properly reflect the movement.

The next step is to add a Rigidbody to the Unity-chan model and make sure Use Gravity is unchecked. To apply the movement to the Rigidbody, we go back to the script.

Animd.SetFloat("inputH", inputH);
        Animd.SetFloat("inputV", inputV);

We compute the movement along each direction:

float moveX = inputH * 20f * Time.deltaTime;
float moveZ = inputV * 25f * Time.deltaTime;

 

Then we add those back to the rigid body component.
       rbody.velocity = new Vector3(moveX, 0f, moveZ);

MQTT in Unity

Let's take a look at the mqttTest.cs file:

using UnityEngine;
using System.Collections;
using System.Net;
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;
using uPLibrary.Networking.M2Mqtt.Utility;
using uPLibrary.Networking.M2Mqtt.Exceptions;

using System;

public class mqttTest : MonoBehaviour {
	private MqttClient client;
	// Use this for initialization
	void Start () {
		// create client instance 
		client = new MqttClient(IPAddress.Parse("143.185.118.233"),8080 , false , null ); 
		
		// register to message received 
		client.MqttMsgPublishReceived += client_MqttMsgPublishReceived; 
		
		string clientId = Guid.NewGuid().ToString(); 
		client.Connect(clientId); 
		
		// subscribe to the topic "hello/world" with QoS 2 
		client.Subscribe(new string[] { "hello/world" }, new byte[] { MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE }); 

	}
	void client_MqttMsgPublishReceived(object sender, MqttMsgPublishEventArgs e) 
	{ 

		Debug.Log("Received: " + System.Text.Encoding.UTF8.GetString(e.Message)  );
	} 

	void OnGUI(){
		if ( GUI.Button (new Rect (20,40,80,20), "Level 1")) {
			Debug.Log("sending...");
			client.Publish("hello/world", System.Text.Encoding.UTF8.GetBytes("Sending from Unity3D!!!"), MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE, true);
			Debug.Log("sent");
		}
	}
	// Update is called once per frame
	void Update () {



	}
}

 

The point in the code where we need to make changes is here: the client must be directed towards our broker and channel.

client = new MqttClient("iot.eclipse.org", 1883, false, null);

Now we subscribe to the channel

// subscribe to the topic "rupam/data" with QoS 2 
        client.Subscribe(new string[] { "rupam/data" }, new byte[] { MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE });

 

The updated mqttTest.cs file

using UnityEngine;
using System.Collections;
using System.Net;
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;
using uPLibrary.Networking.M2Mqtt.Utility;
using uPLibrary.Networking.M2Mqtt.Exceptions;

using System;

public class mqttTest : MonoBehaviour {
	private MqttClient client;
	// Use this for initialization
	void Start () {
        // create client instance 
        client = new MqttClient("iot.eclipse.org", 1883, false, null);

        // register to message received 
        client.MqttMsgPublishReceived += client_MqttMsgPublishReceived; 
		
		string clientId = Guid.NewGuid().ToString(); 
		client.Connect(clientId);

        // subscribe to the topic "rupam/ar" with QoS 2 
        client.Subscribe(new string[] { "rupam/ar" }, new byte[] { MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE });


    }
    string msg = "";
    public static int Num = -1;
    void client_MqttMsgPublishReceived(object sender, MqttMsgPublishEventArgs e) 
	{
        try
        {
            msg = System.Text.Encoding.UTF8.GetString(e.Message).Trim();

            
            if (msg.Equals("STOP"))
            {
                //Num = 4;
                player.LR = -1;
                player.FR = -1;
                
            }
            if (msg.Equals("NO"))
            {
                player.LR = -1;
            }
            if (msg.Equals("FORWARD"))
            {
                player.FR=1;
            }
            if (msg.Equals("REVERSE"))
            {
                player.FR=0;
                Debug.Log("REVERSE");
            }
            if (msg.Equals("LEFT"))
            {
                player.LR = 3;
            }
            if (msg.Equals("RIGHT"))
            {
                player.LR = 2;
            }
            if (msg.Equals("JUMP"))
            {
                player.jump = true;
            }
            if (Num != -1)
             player.Num = Num;

        }
        catch
        {

        }

        Debug.Log("Received: " + msg+" Num="+Num  );
	} 

	void OnGUI(){
		if ( GUI.Button (new Rect (20,40,80,20), "Level 1")) {
            Debug.Log("sending...");
            client.Publish("hello/world", System.Text.Encoding.UTF8.GetBytes("Sending from Unity3D!!!"), MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE, true);
            Debug.Log("sent");
        }
	}
	// Update is called once per frame
	void Update () {
        if (Num != -1)
            Debug.Log("Updating..." + Num);




    }
}

The key point of the updated code is

msg = System.Text.Encoding.UTF8.GetString(e.Message).Trim();

where we decode the message bytes into a string and trim it; this makes the commands sent over MQTT quicker and more reliable to recognize. The place where we handle the movements is the crux of the project: pay particular attention to the mapping part of the code, where MQTT commands are exchanged for movement state.
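To make the mapping easier to see, the if-chain in client_MqttMsgPublishReceived can be condensed into a single switch. This is just the same dispatch restructured for readability; player.FR, player.LR and player.jump are the static fields of the player script:

```csharp
// Condensed form of the command dispatch above: one MQTT command
// string maps to one change of the shared player state.
void client_MqttMsgPublishReceived(object sender, MqttMsgPublishEventArgs e)
{
    string msg = System.Text.Encoding.UTF8.GetString(e.Message).Trim();
    switch (msg)
    {
        case "STOP":    player.FR = -1; player.LR = -1; break; // halt both axes
        case "NO":      player.LR = -1; break;                 // stop turning
        case "FORWARD": player.FR = 1;  break;
        case "REVERSE": player.FR = 0;  break;
        case "LEFT":    player.LR = 3;  break;
        case "RIGHT":   player.LR = 2;  break;
        case "JUMP":    player.jump = true; break;
    }
    Debug.Log("Received: " + msg);
}
```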

 

Now let's look at the changes we made to the main player script. The forward (FR) and left/right (LR) state values of the player default to -1, meaning "no command". To combine multiple walk movements, the inputH and inputV values respond to the MQTT commands.

using UnityEngine;
using System.Collections;

public class player : MonoBehaviour {
    //internal static int Num;
    public Animator anim;
	public Rigidbody rbody;


	private float inputH;
	private float inputV;
	private bool run;
    public static int Num=-1;
    public static int FR = -1;
    public static int LR = -1;
    public static bool jump = false;
    // Use this for initialization
    void Start () 
	{
		anim = GetComponent<Animator>();
		rbody = GetComponent<Rigidbody>();
		run = false;
	}
	
	// Update is called once per frame
	void Update () 
	{
		if(Input.GetKeyDown("1"))
		{
			anim.Play("WAIT01",-1,0f);
		}
		if(Input.GetKeyDown("2"))
		{
			anim.Play("WAIT02",-1,0f);
		}
		if(Input.GetKeyDown("3"))
		{
			anim.Play("WAIT03",-1,0f);
		}
		if(Input.GetKeyDown("4"))
		{
			anim.Play("WAIT04",-1,0f);
		}

		if(Input.GetMouseButtonDown(0))
		{
			int n = Random.Range(0,2);

			if(n == 0)
			{
				anim.Play ("DAMAGED00",-1,0f);
			}
			else
			{
				anim.Play ("DAMAGED01",-1,0f);
			}
		}

		if(Input.GetKey(KeyCode.LeftShift))
		{
			run = true;
		}
		else
		{
			run = false;
		}

		if(Input.GetKey(KeyCode.Space))
		{
			anim.SetBool("jump",true);
		}
		else
		{
			anim.SetBool("jump", false);
		}

		inputH = Input.GetAxis ("Horizontal");
		inputV = Input.GetAxis ("Vertical");
        if (FR == 0)
        {
            inputV = -1f;

        }
        if( FR == 1)
        {
            inputV = 1f;

        }
        if (LR == 2)
        {
            inputH = -1;

        }
        if (LR == 3)
        {
            inputH = 1;

        }
        if (LR== -1)
        {
            //Num = -1;
           // mqttTest.Num = -1;
         //   inputH = 0;
        }
        if(FR==-1)
        {
            //inputV = 0;
            //mqttTest.Num = -1;
        }
        if(jump)
        {
            anim.SetBool("jump", true);
           
        }
        else
        {
            anim.SetBool("jump", false);
        }

        anim.SetFloat("inputH",inputH);
		anim.SetFloat("inputV",inputV);
		anim.SetBool ("run",run);

		float moveX = inputH*20f*Time.deltaTime;
		float moveZ = inputV*50f*Time.deltaTime;

		

		rbody.velocity = new Vector3(moveX,0f,moveZ);

        if (jump)
        {
            jump = false;
            Debug.Log("inputH=" + inputH + " inputV=" + inputV + " dt" + Time.deltaTime + "Num " + Num + " movZ:" + moveZ + "movX:" + moveX);
        }

    }
}






This part of the code allows us to connect to MQTT and use the service.

 

Google Cardboard Integration

Google Cardboard integration was the final step: since the Microsoft HoloLens didn't support MQTT, we decided to settle on Google Cardboard. At first we weren't sure where to start, but we found the Unity starting point at the Google Cardboard link.

 

Now we go to the Git repo to download it. Save the download, then extract it.

Next you need to import the Google Cardboard package into Unity. The scene we used for our experimenting and integration was the headset demo scene, which fulfilled all the requirements for our work.

The most important part for our scene was the camera integration from a VR perspective, followed by additional scene setup. Let's start with it. The first thing to copy into our scene was the GVR Reticle, which needs to be attached to the main camera. Next we had to copy GVR View Main, the floor canvas, the event system and the overlay canvas.

After copying, our scene looked like this while running. The scene is now compatible with Google Cardboard.

We will now make the APK for the project. First we need to change the project's build configuration to Android. When we run the build it starts compiling and building for Android, and after the process is over an APK file is created.

Controlling the Puppet with Hardware

Now that we have added an MQTT gateway to the project, it's time to control it. For Android-based control, all you have to do is generate the MQTT commands intuitively from the Android app.

We have shown how a simple MQTT-based remote control can be created in this article; please refer to it and download the app:

Android MQTT remote control (YouCar project)
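If you want to drive the puppet without the ready-made app, any MQTT publisher will do. Below is a sketch of a simple desktop-side sender using the same M2Mqtt library; PuppetRemote is a hypothetical name, while the topic rupam/ar and the command strings are the ones the updated mqttTest.cs subscribes to and understands:

```csharp
// Hypothetical desktop-side remote: publishes the command strings the
// Unity handler understands on the channel it subscribes to.
using System;
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;

public static class PuppetRemote
{
    static MqttClient client;

    public static void Main()
    {
        client = new MqttClient("iot.eclipse.org", 1883, false, null);
        client.Connect(Guid.NewGuid().ToString());

        // Walk forward, turn left, then stop.
        Send("FORWARD");
        Send("LEFT");
        Send("STOP");
    }

    static void Send(string command)
    {
        client.Publish("rupam/ar",
            System.Text.Encoding.UTF8.GetBytes(command),
            MqttMsgBase.QOS_LEVEL_EXACTLY_ONCE, false);
    }
}
```

Remember the channel lives on a public broker, so keep it specific to your own project.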

Controlling the Puppet with Gestures

You can also use custom gestures for controlling the puppet; please refer to the corresponding section of YouCar.

You need to change the MQTT channel. Keep the channel specific to your project or user, so that when other users run your app their commands don't interfere with the current user's. We tried the gesture part (with Intel RealSense) between the two of us, me and Rupam, far apart in different places in India, one in the east and one in the south, and came up with pretty interesting results: we were able to control the Unity character (our puppet) while far apart. That's the power of MQTT.

The pictures were exchanged using TeamViewer.

Controlling the Puppet with the Intel Edison IoT Device

This part was worked on by our co-author Moumita Das, who created the different gestures for use with the Intel Edison; those gestures are then mapped over MQTT so the board acts as a wearable for our project.

Please refer to the article on accelerometer-based gesture recognition on the Intel Edison by Moumita Das to understand the circuit, the gesture-recognition code, and how to integrate IoT device gestures into the context of the current app. That article elaborates the UP, DOWN, LEFT and RIGHT gestures; you can easily add FORWARD and REVERSE gestures to the framework and use them in the current context.

Why the HoloLens Experiment Failed

 

We don't have a HoloLens device, but we have been working extensively with the HoloLens emulator (we'd love to have one if somebody sponsors it :-) ) and have done a lot of work regarding it. We are also working on a book on HoloLens. Our first idea was to get hold of the HoloLens Origami project; every project I built was on top of it. Some recent experiment pictures follow.

When the project runs in the emulator it shows the same Unity splash screen. We even made a game with spatial mapping, where we air-tap (gesture), balls fall, and we score points in free space where the holograms are placed.

Finally, we thought we would bring the same expertise to this project, but it didn't work. The basic problem was that the MQTT library could not be referenced against the .NET framework Unity uses (3.5); we were unable to add the MQTT framework to the build. We searched the internet a lot but couldn't find the right answers. Some fellows on the Unity forum suggested we could add the framework if we had the IL2CPP backend, but that backend is not in the HoloLens technical preview.

As the IL2CPP backend is not available so far, we couldn't do it. We keep trying, and will of course take different routes too. The next block diagram shows the entire workflow we tried for HoloLens and where it failed.

 

Conclusion

 

After the complete experiment we have built a framework for Unity that can easily be extended to other games (we will be doing that) and can be controlled by gestures, phones and other wearable devices from anywhere; the only connecting medium required is the internet. It was a fascinating experience.

 


License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

About the Authors

Grasshopper.iics
CEO, Integrated Ideas
India
Group type: Organisation

Abhishek Nandy
Software Developer
India
I have been in software development for less than a year and have participated in two contests here at CodeProject: the Intel App Innovation Contest 2012 and the Windows Azure Developer Challenge. I was a finalist at the App Innovation Contest, an App Submission award winner, and won two spot prizes in the Azure Developer Challenge. I was also a finalist at the Intel Perceptual Challenge Stage 2 with 6 entries nominated, and won 2nd prize in the Ultrabook article contest from CodeProject:
http://www.codeproject.com/Articles/523105/Ultrabook-Development-My-Way

Microsoft MVA Fast Track Challenge Global Winner.
Ocutag App Challenge 2013 Finalist.

My work at the Intel AppUp Store:

UltraSensors:
http://www.appup.com/app-details/ultrasensors
UltraKnowHow:
http://www.appup.com/app-details/ultraknowhow

Moumita Das
Software Developer, Integrated Ideas
India

Last Updated 7 Aug 2016
Article Copyright 2016 by Grasshopper.iics, Abhishek Nandy, Moumita Das