Artificial Feelings and Emotions | Artificial Intelligence

Artificial Feelings and Emotions are often considered impossible, but it is now possible to develop a computer program that can process these kinds of data, and to use that program to create a realistic humanoid robot that has feelings and emotions just like a Homo sapiens (human). I have created an algorithm that will show you the process required to create Artificial Feelings and Emotions. Note: if you want to build this program you need to know programming. Any programming language will work fine, but I recommend C, C++, C#, Java, VB.NET, Python, etc.

Let’s begin the process of creating Artificial Feelings and Emotions (AIFE) step by step.

Creating Artificial Feelings

The first thing the user does is perform some action that starts up the program; for example, the user touches some part of the body. When the user touches the body, the program needs to know which part has been touched, how much force is exerted on that part, and what the temperature is at that part. According to my observation, a human's mood changes, or different parts of the brain (amygdala, temporal lobe, occipital lobe, cerebellum, entorhinal cortex, etc.) are activated, when one of their body parts is touched. Humans also respond differently when different words are spoken together or when some sound is generated. For example, when a human hears music, different parts of the brain activate and work together and the body begins to relax; but that does not mean humans will always feel relaxed when music is played. Some humans react in the opposite way, and their facial expression also changes.

In this process I will use my observations to create the AIFE program, following the algorithm below, shown in the form of a diagram.

First, the user gives a command or performs an action. That command is processed by the AI algorithm, which is itself a program. The AI algorithm looks in the data file, which is the brain of the program. The data file contains all the data; for example, for the question "What is your name or age?" it stores the answer "Alexander VII, age = 15". These data files can be XML files or an SQL database; it depends on which language you use and in what format you want to store and access the data. The user's command is searched for in the data file, and after the required information is found it is passed to the translator, which converts the sentence into a different language. The translator is not required, but you can use it anyway if you are making a very powerful program like I did. The translator then returns a result that the user can read or understand. To activate the feeling process we need a sensor connected to an Arduino, or any hardware component capable of measuring force when the user touches it. You can use any touch device for testing purposes, but for this example I have used an Arduino and piezoelectric sensors.
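To make the command-to-answer flow concrete, here is a minimal C++ sketch of the data-file lookup. It stands in for the real data file with a small in-memory map; in a real build you would load the same question/answer pairs from your XML files or SQL database. The names dataFile and lookupAnswer are just placeholders for this example.

#include <iostream>
#include <map>
#include <string>

// Stand-in for the data file: each known command maps to a stored answer.
// In the real program these pairs would be loaded from XML or an SQL database.
std::map<std::string, std::string> dataFile = {
    {"what is your name", "Alexander VII"},
    {"what is your age",  "15"}
};

// Search the data file for the user's command and return the stored answer.
std::string lookupAnswer(const std::string& command) {
    auto it = dataFile.find(command);
    if (it != dataFile.end())
        return it->second;          // found an entry for this command
    return "I do not understand";   // nothing stored for this command
}

int main() {
    std::cout << lookupAnswer("what is your name") << std::endl;  // Alexander VII
    std::cout << lookupAnswer("what is your age")  << std::endl;  // 15
}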

The method is to connect the sensors in an X,Y array. Some scientists are already making robot skin sensors, and those sensors are built in the same way. Connect the sensors in a matrix, connect the matrix to the Arduino, and connect the Arduino to the computer. We will use the Arduino to determine which sensor is activated, at which position, and how much force is exerted on that particular sensor. For example, when the user touches a sensor, in this case the sensor at X[4], Y[3], it produces a small current or signal that the Arduino can understand. We will write a program for the Arduino in C++ (the Arduino language) to use these signals to generate feelings. We also need to connect the Arduino program to the AI program so we can use these signals to determine actions in the data files. For example, if the user has touched the sensor at position X[4], Y[3], the sensor sends a signal to the Arduino microcontroller, which is linked to the computer over the serial connection. The program written for the Arduino sends the result, in this case X[4], Y[3], to the AI program. The AI program uses the result and searches for the information written for that particular result, such as "don't touch me here" or "it hurts". Example:

if (touched(X[4], Y[3])) { searchPosition(X[4], Y[3], datafile); } else { /* do nothing */ }
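For the Arduino side, the sketch below shows one possible way to scan a small sensor matrix and report which sensor was touched and how strong the reading was. It assumes a simple matrix wiring where each row is driven from a digital pin and each column is read on an analog pin; real piezoelectric elements generate their own voltage and need extra signal conditioning, and the pin numbers and threshold are placeholders you would adjust for your own hardware.

// Minimal Arduino sketch: scan a small sensor matrix and report
// which sensor was touched and how strong the reading was.
// Pin numbers and the touch threshold are placeholders for this example.

const int ROWS = 2;
const int COLS = 5;                              // 2 x 5 = 10 sensors, as in this example
const int rowPins[ROWS] = {2, 3};                // digital pins driving each row
const int colPins[COLS] = {A0, A1, A2, A3, A4};  // analog pins reading each column
const int TOUCH_THRESHOLD = 50;                  // minimum reading that counts as a touch

void setup() {
  Serial.begin(9600);
  for (int r = 0; r < ROWS; r++) pinMode(rowPins[r], OUTPUT);
}

void loop() {
  for (int r = 0; r < ROWS; r++) {
    digitalWrite(rowPins[r], HIGH);              // select one row at a time
    for (int c = 0; c < COLS; c++) {
      int force = analogRead(colPins[c]);
      if (force > TOUCH_THRESHOLD) {
        // Send "row,column,force" to the AI program over the serial link.
        Serial.print(r); Serial.print(",");
        Serial.print(c); Serial.print(",");
        Serial.println(force);
      }
    }
    digitalWrite(rowPins[r], LOW);
  }
  delay(50);                                     // scan roughly 20 times per second
}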

I recommend combining the two values, because keeping them separate can produce errors if you are making a real robot skin that responds to touch. Example:

Combining X[4] and Y[3] gives 43.

if (touched(pos(43))) { searchPosition(43, datafile); } else { /* do nothing */ }

You can also use arrays, variables, and switches. You can use the same process for a touch device such as a touchscreen monitor.
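One caution when combining the values: simply writing the digits next to each other (4 and 3 becoming 43) only stays unique while both coordinates are below 10. A safer combination is to multiply the row by the number of columns, as in the small sketch below; the column count of 10 is just an example value, chosen so that it also reproduces the 43 from the example above.

// Combine a row and a column into one unique position key.
// Writing the digits side by side (4 and 3 -> 43) works only while both
// values stay below 10; multiplying by the column count is always unique.
#include <iostream>

const int NUM_COLUMNS = 10;       // placeholder: the width of your sensor matrix

int combinedIndex(int x, int y) {
    return x * NUM_COLUMNS + y;   // x = 4, y = 3 -> 43, with no collisions
}

int main() {
    std::cout << combinedIndex(4, 3) << std::endl;   // prints 43
}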

When the program has detected the position, it searches for an answer or a movement for that particular position, with some tolerance. In this case there is no tolerance because I am using only 10 sensors. You can also measure how much force is exerted on the sensors, and the temperature, with the same method as the position.

if (forceAt(pos(43)) == 5 /* N */) { searchPosition(43, 5 /* N */, datafile); } else { /* do nothing */ }
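Since real force readings are noisy, one way to add the tolerance mentioned above is to round the reading into a coarse band before building the key that is searched in the data file. This is only a sketch: the band boundaries are placeholder values you would calibrate against your own sensors, and forceBand and positionKey are names invented for the example.

#include <iostream>
#include <string>

// Round a raw force reading into a coarse band before the data-file lookup,
// so small measurement noise does not change which answer is found.
// The band boundaries are placeholder values to be calibrated.
std::string forceBand(int rawForce) {
    if (rawForce < 100) return "light";
    if (rawForce < 400) return "medium";
    return "hard";
}

// Build the key used to search the data file, e.g. "43:hard".
std::string positionKey(int combinedIndex, int rawForce) {
    return std::to_string(combinedIndex) + ":" + forceBand(rawForce);
}

int main() {
    std::cout << positionKey(43, 520) << std::endl;   // prints "43:hard"
}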

You need to do a lot of programming to make it behave like a human does.

Emotions

To give emotions to your program you use the same method, but you combine feelings with special words to control the behavior of the program. For example, if I tell the program a joke, the program looks in the data file for special words or special sentences, optionally together with voice tone (a separate process that increases accuracy, but you can leave it out), and looks for the appropriate movement. In this case that is to laugh or smile: when the user tells a joke, the program starts to laugh. You can also use tiny motors to control the facial expression of the robot, and you can connect a facial-expression or object-recognition algorithm to the feelings and emotions. For example, when you are sad or happy your facial expression changes; the program recognizes the expression and tells you "cheer up" or something else.
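As a rough illustration of the special-words idea, the sketch below scans the user's sentence for trigger words and returns the stored reaction. The word list and reactions are invented for the example; a real program would read them from the data file and combine them with the touch and voice-tone inputs described above.

#include <iostream>
#include <map>
#include <string>

// Scan the user's sentence for trigger words and return the stored reaction
// (laugh, smile, comfort, ...). The word lists and reactions are placeholders.
std::map<std::string, std::string> specialWords = {
    {"joke",  "laugh"},
    {"funny", "smile"},
    {"sad",   "say: cheer up"}
};

std::string pickReaction(const std::string& sentence) {
    for (const auto& entry : specialWords) {
        if (sentence.find(entry.first) != std::string::npos)
            return entry.second;        // first trigger word found wins
    }
    return "no reaction";
}

int main() {
    std::cout << pickReaction("let me tell you a joke") << std::endl;  // laugh
    std::cout << pickReaction("i feel sad today")       << std::endl;  // say: cheer up
}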

That's it. If you have any trouble, write it in the comments. It would be dumb for robots to have feelings and emotions, because robots are perfect; but you can make a real human with integrated machines and electronics, with feelings, emotions, and much more.

I will also be writing another blog post on self-learning and self-awareness algorithms and programs.

 


2 comments on “Artificial Feelings and Emotions | Artificial Intelligence”

  1. This is great and I would like to implement it in Python. But I think one could make it more powerful by linking some sort of learning component to it. The best would be to make it learn like our brain does, or simply to use neural networks. For that you would first have to train the algorithm by assigning appropriate outputs to inputs, and you would have to supervise it.

    Although some circuits are innate, this is how most human learning takes place.

    • I have created an algorithm for self-learning, which means the computer can learn as a human does, but it can learn much faster and remember forever, unlike a human being. I have already created a computer that talks and learns like a human, but I haven't connected that computer to a robot. I will be uploading the self-learning algorithm soon.
