
Fog Computing

P Aditya
Gokaraju Rangaraju Institute of Engineering & Technology
(Autonomous under JNTUH)
Master of Computer Applications

Abstract
Fog computing is not a replacement for the cloud; it extends cloud computing and can help provide security in the cloud environment. Similar to the Cloud, Fog provides data, compute, storage, and application services to end users.
Cloud computing promises to significantly change the way we use computers and store our personal and business information. With these new computing and communication paradigms arise new data security challenges. Existing data protection mechanisms such as encryption have failed to protect data in the cloud from unauthorized access.
We propose a different approach for securing data in the cloud using offensive decoy technology. We monitor data access in the cloud and detect abnormal data access patterns. When unauthorized access is suspected and then verified using challenge questions, we launch a disinformation attack by returning large amounts of decoy information to the attacker. This protects against the misuse of the user's real data. Experiments conducted in a local file setting provide evidence that this approach may provide unprecedented levels of user data security in a Cloud environment.
We also elaborate on the motivation and advantages of Fog computing, and analyze its applications in a series of real scenarios, such as smart traffic lights in vehicular networks and software-defined networks.

I. Introduction

CISCO recently delivered the vision of fog computing to enable applications on the billions of connected devices, already connected in the Internet of Things (IoT), to run directly at the network edge. Customers can develop, manage and run software applications on the Cisco IOx framework of networked devices, including hardened routers, switches and IP video cameras. Cisco IOx brings the open-source Linux and Cisco IOS network operating system together in a single networked device (initially in routers). The open application environment encourages more developers to bring their own applications and connectivity interfaces to the edge of the network.
Cloud computing has become a buzzword in recent years, but it largely depends on servers located at remote sites, resulting in slow response times and scalability issues. Response time and scalability play a crucial role in machine-to-machine communication and services. The edge computing platform addresses these problems with a simple idea: locate small servers, called edge servers, in the vicinity of users and devices, and offload to them some of the load of the central servers and/or the users' devices.
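As a minimal illustration of this idea (the server names, latencies and threshold below are invented for the example, not taken from any real deployment), the following sketch routes each request to the closest edge server and falls back to the central cloud server when no edge server is near enough:

# Hypothetical sketch: route a client request to the nearest edge server,
# falling back to a distant central cloud server when no edge is close enough.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    latency_ms: float  # assumed round-trip latency from the client

def pick_server(edge_servers, cloud, max_edge_latency_ms=20.0):
    """Prefer the lowest-latency edge server; otherwise fall back to the cloud."""
    best = min(edge_servers, key=lambda s: s.latency_ms, default=None)
    if best is not None and best.latency_ms <= max_edge_latency_ms:
        return best
    return cloud

edges = [Server("edge-mall", 8.0), Server("edge-campus", 15.0)]
cloud = Server("central-cloud", 120.0)
print(pick_server(edges, cloud).name)  # -> edge-mall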

II. WHAT IS CLOUD COMPUTING?

Cloud Computing:
Cloud computing is a delivery platform which
promises a new way of accessing and storing
personal as well as business information. Cloud
computing refers to the practice of transitioning
computer services such as computation or data
storage to multiple redundant offsite locations
available on the Internet, which allows application
software to be operated using internet-enabled
devices.
Existing data protection mechanisms such as encryption have failed to secure data from attackers, because encryption alone does not verify whether a user is authorized. Cloud computing security has not focused on ways of securing data from unauthorized access. In 2009, for example, confidential documents kept in the cloud did not have much security, so a hacker gained access to them; the Twitter incident is one example of such a data theft attack in the Cloud.

Disadvantages:
Nobody is identified when an attack happens.
It is difficult to detect which user carried out the attack.
We cannot detect which file was hacked.
Cloud Computing Issue: Bandwidth
Transmitting and processing data requires bandwidth, and the more data there is, the more bandwidth is needed. Current cloud computing models cannot keep up with the amount of bandwidth that will be needed.
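As a rough, hypothetical illustration of this pressure (the device count and data rates below are assumptions, not figures from this paper), the following sketch compares the aggregate uplink bandwidth needed to ship every sensor reading to a central cloud with the bandwidth needed when a fog node filters locally and forwards only periodic summaries:

# Hypothetical back-of-envelope estimate: raw uplink to the cloud vs.
# fog-side filtering that forwards only periodic summaries.
def uplink_mbps(devices, bytes_per_sample, samples_per_sec):
    """Aggregate uplink bandwidth in megabits per second."""
    return devices * bytes_per_sample * samples_per_sec * 8 / 1e6

devices = 100_000          # assumed sensor count in one city
raw = uplink_mbps(devices, bytes_per_sample=200, samples_per_sec=10)
# Fog nodes aggregate locally and send one 200-byte summary per device per minute.
summarised = uplink_mbps(devices, bytes_per_sample=200, samples_per_sec=1 / 60)

print(f"raw to cloud: {raw:,.0f} Mbps")               # ~1,600 Mbps
print(f"after fog filtering: {summarised:,.2f} Mbps")  # ~2.7 Mbps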

III. Fog Computing

Fog computing is a model in which data, processing and applications are concentrated in devices at the network edge rather than existing almost entirely in the cloud. Fog Computing is a paradigm that extends Cloud Computing and its services to the edge of the network; the terms Edge Computing and Fog Computing are often used interchangeably. Similar to the Cloud, Fog provides data, compute, storage, and application services to end users, which enables a new breed of applications and services.

CHARACTERISTICS OF FOG COMPUTING:
Proximity to end users
Dense geographical distribution
Support for mobility

Fog reduces service latency and improves QoS (Quality of Service), resulting in a superior user experience. Fog Computing supports emerging Internet of Everything (IoE) applications that demand real-time or predictable latency (industrial automation, transportation, networks of sensors and actuators). The Fog paradigm is well positioned for real-time Big Data and real-time analytics; it supports densely distributed data collection points, hence adding a fourth axis to the often-mentioned Big Data dimensions (volume, variety, and velocity). Unlike traditional data centers, Fog devices are geographically distributed over heterogeneous platforms, spanning multiple management domains. This means data can be processed locally in smart devices rather than being sent to the cloud for processing.
WHY DO WE NEED FOG COMPUTING?

In the past few years, Cloud computing has provided many opportunities for enterprises by offering their customers a range of computing services. The current pay-as-you-go Cloud computing model has become an efficient alternative to owning and managing private data centers for customers running Web applications and batch processing. Cloud computing frees enterprises and their end users from the specification of many details, such as storage resources, computation limitations and network communication cost.
However, this bliss becomes a problem for latency-sensitive applications, which require nodes in the vicinity to meet their delay requirements. As the techniques and devices of the IoT become more involved in people's lives, the current Cloud computing paradigm can hardly satisfy their requirements of mobility support, location awareness and low latency.
Fog computing has been proposed to address the above problems. As Fog computing is implemented at the edge of the network, it provides low latency and location awareness, and improves quality of service (QoS) for streaming and real-time applications. Typical examples include industrial automation, transportation, and networks of sensors and actuators. Moreover, this new infrastructure supports heterogeneity, as Fog devices include end-user devices, access points, edge routers and switches. The Fog paradigm is well positioned for real-time big data analytics, supports densely distributed data collection points, and provides advantages in entertainment, advertising, personal computing and other applications.

WHAT CAN WE DO WITH FOG?


We elaborate on the role of Fog computing in the
following motivating scenarios. The advantages of
Fog computing satisfy the requirements of
applications in these scenarios.

Smart Traffic Lights and Connected Vehicles:
A video camera that senses an ambulance's flashing lights can automatically change street lights to open lanes for the vehicle to pass through traffic. Smart street lights interact locally with sensors to detect the presence of pedestrians and bikers, and measure the distance and speed of approaching vehicles. Intelligent lighting turns on once a sensor identifies movement and switches off as traffic passes. Neighboring smart lights serving as Fog devices coordinate to create a green traffic wave and send warning signals to approaching vehicles. Wireless access points such as Wi-Fi, 3G, road-side units and smart traffic lights are deployed along the roads. Vehicle-to-vehicle, vehicle-to-access-point, and access-point-to-access-point interactions enrich the applications of this scenario.
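The following is a minimal, hypothetical sketch of this coordination; the class, message format and hop count are illustrative inventions, not part of any real traffic-control system. It shows neighboring smart lights, acting as fog devices, turning green in sequence when a local sensor reports an approaching emergency vehicle:

# Hypothetical sketch: smart street lights as fog nodes coordinating a
# "green wave" for an approaching emergency vehicle, using only local messages.
class SmartLight:
    def __init__(self, light_id, neighbors=None):
        self.light_id = light_id
        self.state = "red"
        self.neighbors = neighbors or []   # downstream lights along the road

    def on_sensor_event(self, event):
        # The decision is made locally at the edge; nothing is sent to the cloud.
        if event.get("type") == "emergency_vehicle":
            self.set_green_and_propagate(hops=3)

    def set_green_and_propagate(self, hops):
        self.state = "green"
        print(f"{self.light_id}: green (emergency corridor)")
        if hops > 0:
            for nxt in self.neighbors:
                nxt.set_green_and_propagate(hops - 1)

# Three lights along one street, each knowing only its immediate neighbor.
c = SmartLight("light-3")
b = SmartLight("light-2", [c])
a = SmartLight("light-1", [b])
a.on_sensor_event({"type": "emergency_vehicle", "speed_kmh": 70})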

Wireless Sensor and Actuator Networks:
Traditional wireless sensor networks fall short in applications that go beyond sensing and tracking and require actuators to exert physical actions such as opening, closing or even carrying sensors. In this scenario, actuators serving as Fog devices can control the measurement process itself, as well as its stability and oscillatory behaviour, by creating a closed-loop system. For example, in the scenario of self-maintaining trains, a sensor monitoring a train's ball bearings can detect heat levels, allowing applications to send an automatic alert to the train operator to stop the train at the next station for emergency maintenance and avoid a potential derailment. In the lifesaving air vents scenario, sensors on vents monitor air conditions flowing in and out of mines and automatically change the airflow if conditions become dangerous to miners.
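A minimal sketch of such a closed loop is given below; the threshold value and the sensor and actuator stubs are assumptions made for illustration only:

# Hypothetical closed-loop sketch: a fog node reads a bearing-temperature
# sensor and triggers a local actuator/alert without a round trip to the cloud.
ALERT_THRESHOLD_C = 90.0   # assumed temperature limit for a ball bearing

def control_loop(read_temperature, alert_operator, apply_brake_plan):
    """One iteration of the sense -> decide -> act loop, run on the fog device."""
    temp_c = read_temperature()
    if temp_c >= ALERT_THRESHOLD_C:
        alert_operator(f"Bearing at {temp_c:.1f} C; stop at next station")
        apply_brake_plan("stop_at_next_station")

# Stubbed sensor and actuators so the sketch runs standalone.
control_loop(
    read_temperature=lambda: 94.2,
    alert_operator=print,
    apply_brake_plan=lambda plan: print(f"actuator: {plan}"),
)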

IoT and Cyber-Physical Systems (CPSs):
Fog computing based systems are becoming an important class of IoT and CPSs. Built on traditional information carriers, including the Internet and telecommunication networks, the IoT is a network that can interconnect ordinary physical objects with identified addresses. CPSs feature a tight combination of a system's computational and physical elements. CPSs also coordinate the integration of computer- and information-centric physical and engineered systems.
IoT and CPSs promise to transform our world with new relationships between computer-based control and communication systems, engineered systems and physical reality. Fog computing in this scenario is built on the concepts of embedded systems, in which software programs and computers are embedded in devices for reasons other than computation alone. Examples of such devices include toys, cars, medical devices and machinery. The goal is to integrate the abstractions and precision of software and networking with the dynamics, uncertainty and noise of the physical environment. Using the emerging knowledge, principles and methods of CPSs, we will be able to develop new generations of intelligent medical devices and systems, smart highways, buildings, factories, agricultural and robotic systems.

SECURITY IN FOG COMPUTING:
There are various ways to use cloud services to save or store files, documents and media in remote services that can be accessed whenever the user connects to the Internet. The main problem in the cloud is maintaining the security of users' data in a way that guarantees that only authenticated users, and no one else, gain access to that data. The issue of providing security for confidential information is a core security problem, and existing mechanisms do not provide the level of assurance most people desire. There are various methods to secure remote data in the cloud using standard access control and encryption methods.
It is fair to say that all the standard approaches used for providing security have been demonstrated to fail from time to time for a variety of reasons, including faulty implementations, buggy code, insider attacks, misconfigured services, and the creative construction of effective and sophisticated attacks not envisioned by the implementers of security procedures. Building a secure and trustworthy cloud computing environment is not enough, because attacks on data continue to happen, and when they do and information gets lost, there is no way to get it back. Solutions are needed for such incidents.

Decoy System:
Decoy data, such as decoy documents, honeypots and other bogus information, can be generated on demand and used to detect unauthorized access to information and to poison the thief's exfiltrated information. Serving decoys will confuse an attacker into believing they have exfiltrated useful information when they have not. This technology may be integrated with user behavior profiling technology to secure a user's data in the Cloud.
Whenever abnormal and unauthorized access to a cloud service is noticed, decoy information may be returned by the Cloud and delivered in such a way that it appears completely normal and legitimate. The legitimate user, who is the owner of the information, would readily identify when decoy information is being returned by the Cloud, and hence could alter the Cloud's responses through a variety of means, such as challenge questions, to inform the Cloud security system that it has incorrectly detected an unauthorized access. In the case where the access is correctly identified as unauthorized, the Cloud security system would deliver unbounded amounts of bogus information to the attacker, thus securing the user's true data. This approach can be implemented by providing two additional security features:
1. Validating whether data access is authorized when abnormal information access is detected.
2. Confusing the attacker with bogus information, that is, by providing decoy documents.
The basic idea is that we can limit the damage of stolen data if we decrease the value of that stolen data to the attacker. We can achieve this through a preventive decoy (disinformation) attack. We can secure Cloud services by implementing these additional security features.
We have applied the above concepts to detect unauthorized access to data stored on a local file system by masqueraders, i.e. attackers who pose as legitimate users after stealing their credentials. Our experimental results in a local file system setting show that combining both techniques yields better detection results. These results suggest that this approach may work in a Cloud environment and make the cloud system appear as transparent to the user as a local file system.

Advantages of Fog computing

Bringing data close to the user. Instead of housing information at data center sites far from the end point, the Fog aims to place the data close to the end user.

Creating dense geographical distribution. First of all, big data and analytics can be done faster with better results. Second, administrators are able to support location-based mobility demands without having to traverse the entire network. Third, these edge (Fog) systems would be created in such a way that real-time data analytics become a reality on a truly massive scale.

True support for mobility and the IoT. By controlling data at various edge points, Fog computing integrates core cloud services with those of a truly distributed data center platform. As more services are created to benefit the end user, edge and Fog networks will become more prevalent.

Numerous verticals are ready to adopt. Many organizations are already adopting the concept of the Fog, and many different types of services aim to deliver rich content to the end user. This spans IT shops, vendors, and entertainment companies as well.

Seamless integration with the cloud and other services. With Fog services, we are able to enhance the cloud experience by isolating user data that needs to live on the edge. From there, administrators are able to tie analytics, security, or other services directly into their cloud model.

CONCLUSION
In Fog Computing we present a new approach for solving the problem of insider data theft attacks in the cloud, using dynamically generated decoy files, while also saving the storage required for maintaining decoy files in the cloud. By using the decoy technique in the Fog, insider attacks in the cloud can be minimized.

Future of Fog Computing

With the increase in data and cloud services utilization, Fog Computing will play a key role in helping to reduce latency and improve the user experience. We are now truly distributing the data plane and pushing advanced services to the edge. By doing so, administrators are able to bring rich content to the user faster, more efficiently and, very importantly, more economically. This ultimately means better data access, improved corporate analytics capabilities, and an overall improvement in the end-user computing experience.
Cisco's Ginny Nichols coined the term fog computing. The metaphor comes from the fact that fog is cloud that is close to the ground, just as fog computing concentrates processing at the edge of the network. According to Cisco, fog computing extends from the edge to the cloud in a geographically distributed and hierarchical organization.
Fog could take a burden off the network. As 50 billion objects become connected worldwide by 2020, it will not make sense to handle everything in the cloud. Distributed apps and edge-computing devices need distributed resources. "Fog brings computation to the data. Low-power devices, close to the edge of the network, can deliver real-time responses," says Technical Leader Rodolfo Milito, one of Cisco's thought leaders in fog computing.
The Internet of Everything is changing how we interact with the real world, Milito added: "Things that were totally disconnected from the Internet before, such as cars, are now merging onto it. But as we go from one billion endpoints to one trillion endpoints worldwide, that creates not only a real scalability problem but also the challenge of dealing with complex clusters of endpoints, what we call rich systems, rather than dealing with individual endpoints. Fog's hardware infrastructure and software platform helps solve that."

REFERENCES
1. http://www.cisco.com/web/about/ac50/ac207/crc_new/university/RFP/rfp13078.html
2. http://www.howtogeek.com/185876/what-is-fog-computing/
3. http://newsroom.cisco.com/feature-content?type=webcontent&articleId=1365576
4. http://a4academics.com
5. https://en.wikipedia.org/wiki/Cloud_computing

7: Stratosphere
Situated between 10 km and 60 km in altitude, on the edge of space, the stratosphere is named after the different layers, or strata, of wind within it. The extreme altitude also presents unique engineering challenges: air pressure is 1% of that at sea level, temperatures hover around -50°C, and the thinner atmosphere offers less protection from the UV radiation and temperature swings caused by the sun's rays. By carefully designing the balloon envelope to withstand these conditions, Project Loon is able to take advantage of the steady stratospheric winds and remain well above weather events, wildlife and airplanes. Another reason for placing Google's balloons in the stratosphere is the transmission of radio-frequency waves, because the stratosphere has almost no effect on radio waves [9].
IV. Grand Challenges

Balloon control: The huge challenge for Project Loon is to figure out how to manage a free-floating balloon fleet by modeling wind patterns and navigating simply by going up or down to find the right airstream [10].
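A toy sketch of that navigation idea is shown below; the wind-layer data and the simple closest-bearing rule are invented for illustration:

# Hypothetical sketch: choose a balloon's altitude by picking the stratospheric
# wind layer whose direction is closest to the bearing we want to travel.
WIND_LAYERS = {          # altitude (km) -> wind bearing (degrees), invented data
    18: 45,
    19: 90,
    20: 170,
    21: 260,
}

def angle_diff(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def choose_altitude(desired_bearing):
    return min(WIND_LAYERS, key=lambda alt: angle_diff(WIND_LAYERS[alt], desired_bearing))

print(choose_altitude(100))   # -> 19 (its 90-degree wind is closest to 100)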
Balloon life: A primary focus will be exploring alternatives for Google's balloons, as the balloons can stay in the stratosphere for only about 100 days after launch. This is because of the regular changes in temperature and atmosphere; another, and the main, reason behind the short life of the balloons is the limited protection from UV radiation, because the ozone layer there is very thin and protection from the direct harmful rays of the Sun is minimal [2]. For the Google team, the biggest challenge is to extend the balloons' life so that they can stay up longer to provide Internet access.
Signal strength: Internet services are wireless and provided by the balloons from the stratosphere, where atmospheric changes are very common and can affect radio-frequency transmission. This can weaken the radio-frequency signal strength and could cause interference in broadcasting high-speed Internet.
Wi-Fi connectivity: This is the mobile generation, and most Internet users access Internet services from their mobile phones. In the current scenario, most network service providers offer mobile Internet at 4G speeds. It will be a grand challenge for Google to make its service popular, because at present there is no direct Wi-Fi connectivity from a balloon to mobile devices.
Flight control: The most important and biggest challenge for Google is handling flight control and balloon control at the same time. If any balloon fails, landing it safely while also controlling the surrounding air traffic will be a big task. It also takes more than four hours to successfully establish a single balloon in the stratosphere, and all of the air traffic must be controlled during that time.
Coverage area: A single balloon's network coverage is around 1,250 square kilometers, which means around 4 lakh (400,000) balloons would be required to cover the Earth's entire surface. Increasing the coverage area of each balloon is therefore an important area to work on.
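As a quick sanity check of that estimate (using the Earth's total surface area of roughly 510 million square kilometers, a standard approximation rather than a figure from the text):

# Rough sanity check of the balloon-count estimate in the text.
EARTH_SURFACE_KM2 = 510_000_000   # approximate total surface area of the Earth
COVERAGE_PER_BALLOON_KM2 = 1_250  # figure quoted in the text

balloons_needed = EARTH_SURFACE_KM2 / COVERAGE_PER_BALLOON_KM2
print(f"{balloons_needed:,.0f} balloons")   # ~408,000, i.e. about 4 lakh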
V. Rich Area for Future Research

The Google team is designing its own IEEE 802.11-based standards to develop high-speed wireless Internet access comparable to fiber-based broadband services and to provide Wi-Fi connectivity to mobile devices. Google X will also look at improving the balloon design so that it can better withstand the atmospheric pressure and UV radiation in the stratosphere. Another area where Google will need to work is the automation of the balloon and flight control systems.
VI. Conclusion

This paper has reviewed significant events in the development of Google's balloons and the understanding that has enabled this technology to become progressively more capable and cost-effective for a growing range of Internet services and applications. With additional research and development, significantly more valuable applications are within reach.

References
http://www.google.com/loon
http://en.wikipedia.org/wiki/Projectloon
http://www.wired.com/gadgetlab/2013/08/googlexproject-loon/all/
