
Actions from Thoughts!

Real-time Direct Brain-Machine Interfaces

Submitted by:
R. Sitaram
EMAIL: silaram.8@gmail.com CONTACT: 9676225975

P.Manikanta
manikanta2138@gmail.com 9963202138

4TH ECE

ADITYA ENGINEERING COLLEGE


SURAMPALEM

ABSTRACT:
Science has made great strides in the past few decades towards uncovering the basic principles underlying the brain's ability to receive sensation and control movement. These discoveries, along with revolutionary advances in computing power and microelectronics technology, have led to an emerging view that neural prosthetics, or electronic interfaces within the brain for restoration or augmentation of physiological function, may one day be possible. Real-time direct interfaces between the brain and electronic and mechanical devices can be used to restore sensory and motor functions lost through injury or disease. Hybrid brain-machine interfaces have the potential to enhance our perceptual, motor and cognitive capabilities by revolutionizing the way we use computers and interact with remote environments. A brain-machine interface provides a way for people with damaged sensory/motor functions to use their brain to control artificial devices and restore lost capabilities. Advances in recording from and stimulating neuronal tissue, together with emerging developments in microchip design, computer science and robotics, have the potential to coalesce into a new technology devoted to creating interfaces between the human brain and artificial devices. Such technology could allow patients to use brain activity to control electronic, mechanical or even virtual devices, leading to new therapeutic alternatives for restoring lost sensory, motor and even cognitive functions.

INTRODUCTION:
A brain-machine interface is an interface in which a brain accepts and controls a mechanical device as a natural part of its representation of the body. An immediate goal of brain-machine interface research is to provide a way for people with damaged sensory/motor functions to use their brain to control artificial devices and restore lost capabilities. By combining the latest developments in computer technology and hi-tech engineering, a person suffering from paralysis might be able to control a motorized wheelchair or a prosthetic limb just by thinking about it. Before humans can use brain-interface techniques to control artificial devices, they must first understand how the brain gives commands. A brain interface might work by recording neurological activity over long periods of time. The electrical activity of millions of brain cells (neurons) can be translated into precise sequences of skilled movements. Coordinated neuronal activity also provides us with exquisite perceptual and sensorimotor capabilities. These new technologies augment human performance through the ability to noninvasively access codes in the brain in real time and integrate them into peripheral-device or system operations.

BRAIN-COMPUTER INTERFACE - AN EARLY DEVELOPMENT:
This section describes the principles of a communication system called the Brain-Computer Interface (BCI). With this system the user can control applications by brain activity alone; no peripheral muscles or nerves are required. Brain activity can be used for communication by classifying it into different mental tasks, which correspond to functions in the application being used, e.g. pressing a key or moving a mouse. The user concentrates on different mental tasks, which activate different functional areas of the brain. This activity is measured as the electroencephalogram (EEG), and certain features, usually the power spectrum of the EEG, are extracted from it. A BCI is an interface through which a brain can talk with a computer in two ways: 1. The computer system can learn what the brain is doing or is going to do. 2. The brain can accept commands from the computer.

EEG SIGNALS & MEASUREMENT - AN OVERVIEW:
The neurons in our brain communicate with each other by firing electrical impulses; this creates an electric field which travels through the cortex, the dura, the skull and the scalp. These electrical impulses are referred to as the EEG. The fundamental assumption behind the EEG signal is that it reflects the dynamics of electrical activity in populations of neurons.

Frequency bands of the EEG:

Band            Frequency [Hz]   Amplitude [µV]   Location
Alpha (α)       8-12             10-150           Occipital/parietal regions
Mu (µ) rhythm   9-11             varies           Precentral/postcentral regions
Beta (β)        14-30            25               Typically frontal regions
Theta (θ)       4-7              varies           Varies
Delta (δ)       <3               varies           Varies

Evoked Responses from the Human Being:
In order to communicate via brain activity, the user must be able to control the EEG signal. These types of brain activity can be divided into two groups: evoked responses, which are evoked by a sensory stimulus such as a flashing light, and spontaneous EEG signals, which occur without a stimulus, such as the alpha or mu rhythm, and which the user can learn to control with biofeedback. Evoked potentials (EP) require a specific sensory stimulus. An example of an EP is the visual evoked potential (VEP): if the stimulus is given in the form of a flashing light, the EEG over the visual cortex will have the same frequency as the flashing light. Event-related potentials (ERP) are DC changes tied to a discrete event; an ERP is a response to a stimulus or an event and it either coincides with the event or follows it after a short delay.
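The features most commonly extracted for BCI use are the powers of the rhythms tabulated above. The following is a minimal illustrative sketch, not taken from the paper: it computes per-band power of a synthetic EEG channel with Welch's method, assuming a 256 Hz sampling rate and band limits taken from the table.

```python
# Minimal sketch (synthetic signal, assumed 256 Hz sampling rate): estimating
# the power of each EEG rhythm band with Welch's power spectral density estimate.
import numpy as np
from scipy.signal import welch

fs = 256.0                                   # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic 10 s "EEG": a 10 Hz alpha-like rhythm plus broadband noise (volts).
eeg = 40e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * np.random.randn(t.size)

# Band limits from the table above.
bands = {"delta": (0.5, 3), "theta": (4, 7), "alpha": (8, 12), "beta": (14, 30)}

freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))  # power spectral density
df = freqs[1] - freqs[0]
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs <= hi)
    power = np.sum(psd[mask]) * df                   # integrate the PSD over the band
    print(f"{name:6s} band power: {power:.3e} V^2")
```

On this synthetic signal the alpha band dominates, as expected from the injected 10 Hz rhythm.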

Event-Related Synchronization (ERS) and Event-Related Desynchronization (ERD) are the AC changes tied to a discrete event. More precisely, ERD/ERS is the blocking of the alpha rhythm due to sensory processing or the blocking of the µ-rhythm due to motor behavior.

Measuring the EEG signal: This deals with measuring the difference in electrical potential between various places on the surface of the scalp. In an EEG measurement, a potential difference between two electrodes is measured. The signals picked up by the electrodes may be combined into channels, or a channel may correspond to a single electrode. The signal is then amplified, filtered of artifacts and displayed on a computer screen.

Neural implantable electrodes (e.g. arrays of 100 electrodes with 400 micrometer separation): The number of electrodes needed depends on the type and location of the brain activity and on the number of channels available. The combination of electrodes used to study a particular point in time is called a montage.
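To make the idea of channels and montages concrete, here is a minimal sketch, not from the paper: the electrode labels, sampling rate and random potentials are all made up, and each channel is simply the potential difference between a pair of electrodes.

```python
# Minimal sketch (hypothetical electrodes and random data): forming bipolar EEG
# channels as potential differences between pairs of scalp electrodes (a montage).
import numpy as np

fs = 256                                    # assumed sampling rate in Hz
n_samples = 5 * fs
electrode_names = ["O1", "O2", "C3", "C4"]  # hypothetical scalp positions
raw = {name: 50e-6 * np.random.randn(n_samples) for name in electrode_names}

# Each channel is the difference between two electrode potentials.
montage = [("O1", "O2"), ("C3", "C4")]
channels = {f"{a}-{b}": raw[a] - raw[b] for a, b in montage}

for name, sig in channels.items():
    print(name, "peak-to-peak amplitude:", np.ptp(sig), "V")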

BCI Methods: Usually the brain-computer interface uses neural electrical signals, such as the EEG or multiunit activity. The most significant parts are signal recording and processing. One of the biggest challenges in developing a true brain-machine interface is the development of electrode devices and surgical methods that are minimally invasive, to allow safe, long-term recording of neurological activity. Although the EEG is an imperfect, distorted indicator of brain activity, it remains nonetheless its direct consequence. It is also based on a much simpler technology, and is characterized by much smaller time constants, than other non-invasive approaches such as magnetoencephalography (MEG), positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). When it became possible to process digitized EEG signals on a computer, the temptation was great to use the EEG as a direct communication channel from the brain to the real world. In order to create a communication channel from the brain to a computer, a BCI must first acquire the signals generated by cortical activity with a typical EEG acquisition device. Some pre-processing is generally performed because of the high levels of noise and interference usually present; features related to specific EEG components must then be extracted.
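A minimal sketch of the pre-processing step mentioned above, using assumed parameters (256 Hz sampling, 50 Hz mains, a 1-40 Hz band of interest) and a random stand-in for the raw EEG; this is illustrative only, not the paper's pipeline.

```python
# Minimal sketch (assumed parameters): removing power-line interference and
# restricting a raw EEG channel to the frequency range of interest before
# feature extraction.
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

fs = 256.0                                       # assumed sampling rate in Hz
raw = 50e-6 * np.random.randn(int(10 * fs))      # stand-in for 10 s of raw EEG

# 50 Hz notch filter against mains interference (60 Hz in some countries).
b_notch, a_notch = iirnotch(w0=50.0, Q=30.0, fs=fs)
eeg = filtfilt(b_notch, a_notch, raw)

# Band-pass 1-40 Hz: keeps delta through beta, discards slow drifts and
# high-frequency noise.
b_bp, a_bp = butter(N=4, Wn=[1.0, 40.0], btype="bandpass", fs=fs)
eeg = filtfilt(b_bp, a_bp, eeg)
```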

System Components: A BCI system can be divided into four components. The first is the set-up of the detection equipment, which is almost always based on EEG techniques. The second is a system or algorithm to provide a stimulus. The third is the detection of the response to the corresponding stimulus, and the fourth is the actual control of the application through this interface.

Operation: Operating a BCI system is not simply a matter of tapping into the user's EEG and listening to what happens. The user usually generates some sort of mental activity pattern that is later detected and classified.
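The four components can be pictured as one object wired together; the sketch below is purely illustrative, with hypothetical class and function names that do not come from the paper, and trivial stand-ins for each component.

```python
# Minimal sketch (hypothetical names): the four BCI components listed above
# wired together into a single trial loop.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class BCISystem:
    acquire: Callable[[], List[float]]              # 1. detection equipment (EEG)
    present_stimulus: Callable[[], None]            # 2. stimulus-providing algorithm
    detect_response: Callable[[List[float]], int]   # 3. response detection/classification
    apply_control: Callable[[int], None]            # 4. control of the application

    def run_trial(self) -> None:
        self.present_stimulus()               # e.g. flash a symbol on screen
        samples = self.acquire()              # record the corresponding EEG epoch
        task = self.detect_response(samples)  # classify the user's response
        self.apply_control(task)              # act on it in the application

# Toy usage with stand-in functions:
bci = BCISystem(acquire=lambda: [0.0] * 256,
                present_stimulus=lambda: print("stimulus shown"),
                detect_response=lambda s: 0,
                apply_control=lambda t: print("selected option", t))
bci.run_trial()
```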

Preprocessing: The raw EEG signal requires some preprocessing before feature extraction. This preprocessing includes removing unnecessary frequency bands, averaging the current brain-activity level, transforming the measured scalp potentials into cortex potentials, and denoising.

Detection: Detecting the input from the user and then translating it into an action can be considered the key part of any BCI system. Detection means trying to identify these mental tasks from the EEG signal. It can be done in the time domain, e.g. by comparing amplitudes of the EEG, or in the frequency domain. This usually involves digital signal processing for sampling and band-pass filtering the signal, then calculating these time- or frequency-domain features, and finally classifying them. The classification algorithms include simple comparison of amplitudes, linear and non-linear equations, and artificial neural networks. Through constant feedback from the user to the system and vice versa, both partners gradually learn more about each other and improve the overall performance.
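As a concrete illustration of detection, here is a minimal sketch, not from the paper: two mental tasks are simulated as trials with strong or suppressed alpha rhythm, the alpha-band amplitude is used as a time-domain feature, and a linear discriminant (one of the "linear equation" classifiers mentioned above) separates them. All signals and parameters are synthetic assumptions.

```python
# Minimal sketch (synthetic trials): detecting which of two mental tasks the user
# performed from the alpha-band amplitude, classified with linear discriminant analysis.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

fs = 256.0
rng = np.random.default_rng(0)
b, a = butter(4, [8, 12], btype="bandpass", fs=fs)      # alpha-band filter

def make_trial(alpha_amp):
    """2 s synthetic trial: an alpha rhythm of given amplitude plus broadband noise."""
    t = np.arange(0, 2, 1 / fs)
    return alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 10e-6, t.size)

def features(trial):
    """Time-domain feature: log amplitude (std) of the alpha-band filtered signal."""
    return [np.log(np.std(filtfilt(b, a, trial)))]

# Task 0: relaxed (strong alpha). Task 1: motor imagery (suppressed alpha).
trials = [make_trial(40e-6) for _ in range(50)] + [make_trial(5e-6) for _ in range(50)]
labels = [0] * 50 + [1] * 50
X = [features(tr) for tr in trials]

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3,
                                          stratify=labels, random_state=0)
clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```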

Control: The final part consists of applying the will of the user to the application in use. The user chooses an action by controlling his or her brain activity, which is then detected and classified into the corresponding action. Feedback is provided to the user by audio-visual means, e.g. when typing with a virtual keyboard the letter appears in the message box.

Biofeedback: Biofeedback is defined as biological information which is returned to the source that created it, so that the source can understand it and gain control over it. In BCI systems this biofeedback is usually provided visually, e.g. the user sees a cursor moving up or down or a letter being selected from the alphabet.
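The control and feedback steps can be sketched as a small loop; everything here (the toy threshold classifier, the cursor actions, the simulated alpha-power stream) is a made-up illustration of the idea described above, not the paper's implementation.

```python
# Minimal sketch (hypothetical actions and a toy classifier): turning the classified
# mental state into an application command and returning visual feedback to the user.
class ThresholdClassifier:
    """Toy stand-in: high alpha power -> task 0 (rest), low -> task 1 (imagery)."""
    def __init__(self, threshold):
        self.threshold = threshold
    def predict(self, alpha_power):
        return 0 if alpha_power > self.threshold else 1

ACTIONS = {0: "cursor up", 1: "cursor down"}            # assumed two-class set-up

def control_step(classifier, alpha_power, cursor_y):
    task = classifier.predict(alpha_power)
    cursor_y += 1 if task == 0 else -1                  # apply the chosen action
    print(f"detected {ACTIONS[task]!r} -> cursor at y={cursor_y}")   # visual feedback
    return cursor_y

clf = ThresholdClassifier(threshold=1e-9)
y = 0
for power in [2e-9, 3e-9, 0.5e-9, 0.2e-9]:              # simulated alpha-power stream
    y = control_step(clf, power, y)
```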

HUMAN BRAIN-MACHINE INTERFACE:


It is possible to use brain signals to control artificial devices, and there are devices that accomplish this goal (for example, brain-actuated technology, neuroprostheses and neurorobots). These devices are collectively called hybrid brain-machine interfaces (HBMIs). The word hybrid reflects the fact that these applications rely on continuous interactions between living brain tissue and artificial electronic or mechanical devices. HBMIs incorporate two main types of application. Type 1: these devices use artificially generated electrical signals to stimulate brain tissue in order to transmit some particular type of sensory information or to mimic a particular neurological function. The classic example of this application is an auditory prosthesis. In addition, type 1 HBMIs include methods for direct stimulation of the brain to alleviate pain, to control motor disorders such as Parkinson's disease, and to reduce epileptic activity by stimulation of cranial nerves. Type 2: these devices rely on the real-time sampling and processing of large-scale brain activity to control artificial devices. An example of this application would be the use of neural signals derived from the motor cortex to control the movements of a prosthetic robotic arm in real time. The design and implementation of HBMIs will involve the combined efforts of many areas of research, such as neuroscience, computer science, biomedical engineering, very-large-scale integration (VLSI) design and robotics.

BUILDING A HBMI:

The first of the many challenges associated with the development of any HBMI is the need to understand better the principles by which neural ensembles encode sensory, motor and cognitive information. To design a type 2 HBMI that uses brain-derived signals to control a prosthetic robotic arm, we will need to learn how to sample and decode the motor signals generated by neurons and how to feed them into an artificial device to mimic the intended movement.

SCHEMATIC OF HBMI:

Recording Brain Activity: Fundamental parameters of motor control emerge from the collective activation of large distributed populations of neurons in the primary motor cortex (M1). Single M1 neurons are broadly tuned to the direction of force required to generate a reaching arm movement. In other words, even though these neurons fire maximally before the execution of a movement in one direction, they also fire significantly before the onset of arm movements in a broad range of other directions. Therefore, to compute a precise direction of arm movement, the brain may have to perform the equivalent of a neuronal vote or, in mathematical terms, a vector summation of the activity of these broadly tuned neurons. This implies that to obtain the motor signals required to control an artificial device we will need to sample the activity of many neurons simultaneously and design algorithms capable of extracting motor control signals from these ensembles. Moreover, it will be crucial to investigate how these neural ensembles interact under more complex and real-world experimental conditions to generate different motor behaviors. The next step, and one of the most difficult challenges, is to define a strategy for extracting meaningful control information from neural ensemble activity in real time.
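The "neuronal vote" described above can be illustrated with a population-vector sum over simulated, broadly (cosine-) tuned neurons; the tuning parameters, neuron count and noise model below are made up for the sketch and are not from the paper.

```python
# Minimal sketch (simulated neurons): vector summation of broadly tuned M1-like
# neurons to recover the intended 2-D movement direction (population vector).
import numpy as np

rng = np.random.default_rng(1)
n_neurons = 100
preferred = rng.uniform(0, 2 * np.pi, n_neurons)     # each neuron's preferred direction

def firing_rates(movement_angle):
    """Broad cosine tuning: maximal at the preferred direction, non-zero elsewhere."""
    base, gain = 10.0, 8.0
    rates = base + gain * np.cos(movement_angle - preferred)
    return rng.poisson(rates)                        # noisy spike counts

intended = np.deg2rad(60)                            # the movement the brain "intends"
counts = firing_rates(intended)

# Each neuron votes along its preferred direction, weighted by its firing rate.
weights = counts - counts.mean()
pop_vector = np.array([np.sum(weights * np.cos(preferred)),
                       np.sum(weights * np.sin(preferred))])
decoded = np.rad2deg(np.arctan2(pop_vector[1], pop_vector[0])) % 360
print(f"intended 60 deg, decoded {decoded:.1f} deg")
```

Even though no single neuron pins down the direction, the weighted sum across the ensemble recovers it closely.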

SIGNAL PROCESSING:
A variety of linear and nonlinear multivariate algorithms, such as discriminant analysis, multiple linear regression and artificial neural networks, are used to carry out real-time and off-line analysis of neural ensemble data. New developments in the design of brain-inspired VLSI, an exciting area of research aimed at modeling neuronal systems in silicon, may provide the means for achieving the kind of efficient real-time neural signal analysis required for HBMIs. This technology may allow pattern-recognition algorithms, such as artificial neural networks or realistic models of neural circuits, to be implemented directly in silicon circuits. From an implementation point of view, analytical neurochips are ideal, as they could be interfaced with the instrumentation neurochip and be chronically implanted in the subject. The final component of the idealized HBMI is a real-time control interface that uses processed brain signals to control an artificial device. The types of devices used are likely to vary considerably with each application, ranging from elaborate electrical pattern generators that control muscles to complex robotic and computational devices designed to augment motor skills.
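As an illustration of the multiple linear regression mentioned above, the following sketch fits an off-line linear model from binned ensemble firing rates to a continuous motor variable; the data, encoding model and dimensions are entirely synthetic assumptions.

```python
# Minimal sketch (synthetic data): off-line multiple linear regression from binned
# ensemble firing rates to a continuous motor variable (e.g. hand velocity).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_bins, n_neurons = 2000, 50
true_weights = rng.normal(0, 1, n_neurons)                 # hidden encoding weights

rates = rng.poisson(5, size=(n_bins, n_neurons))           # binned spike counts
velocity = rates @ true_weights + rng.normal(0, 2, n_bins) # variable they encode

model = LinearRegression().fit(rates[:1500], velocity[:1500])
print("held-out R^2:", model.score(rates[1500:], velocity[1500:]))
```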

One example is the design of a brain pacemaker that monitors neural activity using a VLSI chip designed to detect seizure activity. When seizure activity is detected, the VLSI chip sends a signal to an implanted stimulus generator that drives either a nerve-cuff electrode or a mini-pump for drug delivery, either of which can stop the seizure activity.
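The detect-then-stimulate logic of such a pacemaker can be sketched with a toy amplitude-threshold detector standing in for the VLSI seizure-detection chip; the signal, threshold and window length below are made up for illustration and would in practice be tuned per patient.

```python
# Minimal sketch (toy detector): amplitude-threshold seizure detection that
# triggers the implanted stimulus generator, as described above.
import numpy as np

fs = 256.0
t = np.arange(0, 20, 1 / fs)
eeg = 20e-6 * np.random.randn(t.size)                  # baseline activity
# Inject a "seizure-like" burst of large-amplitude rhythmic activity at 10-14 s.
seizure = (t > 10) & (t < 14)
eeg[seizure] += 300e-6 * np.sin(2 * np.pi * 20 * t[seizure])

def rms(window):
    return np.sqrt(np.mean(window ** 2))

def trigger_stimulator():
    print("seizure detected -> stimulus generator / drug mini-pump activated")

win = int(fs)                                          # 1 s analysis windows
threshold = 100e-6                                     # assumed amplitude threshold
for start in range(0, eeg.size - win, win):
    if rms(eeg[start:start + win]) > threshold:
        trigger_stimulator()
        break
```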

HBMI for controlling a robotic prosthetic arm using brain-derived signals: Multiple, chronically implanted, intracranial microelectrode arrays would be used to sample the activity of large populations of single cortical neurons simultaneously. The combined activity of these neural ensembles would then be transformed by a mathematical algorithm into continuous three-dimensional arm-trajectory signals that would be used to control the movements of a robotic prosthetic arm. A closed control loop would be established by providing the subject with both visual and tactile feedback signals generated by the movement of the robotic arm (a sketch of such a loop is given below).

Potential applications of Type 2 HBMIs: A clinical application of HBMIs that could emerge in the near future aims at restoring different aspects of motor function in patients with severe body paralysis, caused primarily by strokes, spinal cord lesions or peripheral degenerative disorders. Advances in this rapidly growing field of research indicate that neural signals from healthy regions of the brain could be used to control the movements of artificial prosthetic devices, such as a robotic arm. These observations also raise the intriguing hypothesis that, by establishing a closed control loop with an artificial device, the brain could incorporate electronic, mechanical or even virtual objects into its somatic and motor representations, and operate upon them as if they were simple extensions of our own bodies. An obvious application of an output BMI is as a motor neuroprosthetic device for paralyzed individuals who are unable to deliver movement intentions to their muscles, allowing them to act through a computer interface. This technology could allow patients to use brain activity to control electronic, mechanical or even virtual devices, leading to new therapeutic alternatives for restoring lost sensory, motor and even cognitive functions. By combining the latest developments in computer technology and hi-tech engineering, a person suffering from paralysis might be able to control a motorized wheelchair or a prosthetic limb just by thinking about it.
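Returning to the robotic-arm example above, the closed loop can be sketched as follows. This is a simulation-only illustration: the decoder matrix is assumed to have been fitted off-line (e.g. by regression as sketched earlier), and the spike-count stream and feedback are stand-ins, not the paper's system.

```python
# Minimal sketch (simulated streams): binned ensemble activity is mapped to a 3-D
# arm velocity each time step, integrated into a trajectory, and the arm's state is
# fed back to the subject, closing the control loop.
import numpy as np

rng = np.random.default_rng(3)
n_neurons = 96                                        # assumed array size
decoder_W = rng.normal(0, 0.01, size=(3, n_neurons))  # assumed pre-fitted linear decoder

def read_spike_counts():
    """Stand-in for one 100 ms bin of counts from the implanted microelectrode arrays."""
    return rng.poisson(4, n_neurons)

def feedback(position):
    """Stand-in for visual/tactile feedback of the robotic arm's state."""
    print("arm at", np.round(position, 3))

position = np.zeros(3)
dt = 0.1                                              # 100 ms bins
for _ in range(10):                                   # ten iterations of the loop
    counts = read_spike_counts()
    velocity = decoder_W @ (counts - counts.mean())   # ensemble activity -> 3-D velocity
    position = position + velocity * dt               # integrate into a trajectory
    feedback(position)
```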

CONCLUSION:
The idea of moving robots or prosthetic devices not by manual control but by mere thinking (i.e., by the brain activity of human subjects) is a fascinating prospect. Medical cures are unavailable for many forms of neural and muscular paralysis, and the enormity of the deficits caused by paralysis is a strong motivation to pursue BMI solutions. This approach would help many patients to control prosthetic devices on their own simply by thinking about the task. The technology is well supported by rapid developments in biomedical instrumentation, microelectronics, signal processing, artificial neural networks and robotics. It is hoped that these systems will be effectively implemented for many biomedical applications.

