
No. 1

Mind-driven cars? Not yet! YET!

Futurology

Past technologies

GPU Analysis

IT TIMES TODAY

IN THIS ISSUE
What's up, Doc? 03
Tesla Up to Date 04
NetBeans goes Apache 06
New NVIDIA GPUs 08
Old timeline 12
New timeline 14
Echo at Home 16
IBM Quantum Processor 18
Future Realities 20
Link to Neural 22

WHAT'S UP, DOC?


This is just a compilation of computer technology articles and news, reviewing the past, present, and possible futures this science could achieve.

Disclaimer:
This magazine must be considered an educational project, and absolutely non-commercial. The advertisements included are just examples of fake or actual marketing practices and have no billing relation with the editors. Some articles have been selected from esteemed sources and are properly quoted under fair use. Articles that are not linked or quoted should be considered Creative Commons (CC) BY-SA. All articles are the property of their respective authors and editors.

Tesla keeps up to date!


The conversation around Tesla's upcoming v8.0 software update has revolved around the improved Autopilot, and rightfully so with the new radar processing technology to be introduced with it. But for many, the biggest change will be the entire UI overhaul.
Today we take our first exclusive look at the new UI ahead of the wide release of v8.0, which is expected to be pushed to Model S and X owners later this week, thanks to new pictures from an anonymous source. We were able to confirm independently from a separate source that we've indeed located pictures from the latest version of Tesla 8.0.
While the Autopilot update is great for every Tesla owner since the end of 2014, when Tesla introduced the Autopilot hardware in every new car, the new UI affects every Tesla owner since the launch of the Model S in 2012. With this said, some of the most interesting UI upgrades actually only affect Tesla vehicles with Autopilot.

Tesla upgraded the Autopilot visualization on the dashboard to now enable the rendering of cars at the correct angle relative to the Tesla, instead of just in parallel to the Tesla's trajectory. The speed setting for TACC is now inside the icon. As for the Autosteer icon, it is still a representation of the steering wheel, but now it's inside a blue circle when engaged, which makes it much more visible.

Here's a before and after v8.0:


Now to a few new design features that every owner can enjoy, even without Autopilot.
The most noticeable is probably the ability to remove the menu bar from the top in order to gain some screen real estate for the applications on Tesla's 17-inch touchscreen. Instead of the fixed menu bar, some of the icons hover over the application. The focus was really on the media app.
Another surprise in v8.0: Tesla improved its regenerative braking software.

The automaker warns that you may experience "slightly stronger deceleration"; it should translate to more energy being recuperated.
That's the bulk of the new user interface. We also explained some other features unrelated to the UI changes in our original v8.0 report.
As CEO Elon Musk said last week, Tesla is aiming for a wide release of v8.0 on Wednesday.

Source: https://electrek.co

Tesla Powerwall

ORACLE NETBEANS GOES TO APACHE

Oracle wants to dump its NetBeans Java integrated development environment on the Apache Software Foundation.
Once upon a time, NetBeans was a significant open-source Java integrated development environment (IDE). Oracle, which has been backing away from Java, took another step toward dropping Java as a priority by dropping support for NetBeans, its open-source IDE, tooling platform, and application framework. NetBeans is written in Java and is primarily for creating Java programs. It also, however, supports other languages, such as JavaScript, PHP, and C/C++.
NetBeans has a long history dating back to 1995. It was acquired by Sun in 2000. Oracle picked up NetBeans as part of its 2010 deal for Sun.
Since then, Oracle has been shedding Sun's software programs. Larry Ellison, Sun's ruler no matter his title, dropped OpenSolaris immediately. In 2013, Oracle sunsetted most of Sun's virtualization technologies. The most relevant of Oracle's past moves away from Sun's software to the NetBeans situation is how Oracle abandoned OpenOffice. Now, OpenOffice is on its deathbed.

Another significant issue is that Oracle no longer makes Java a priority. The much-delayed core Java Enterprise Edition (JEE) 8 may finally launch in 2017. Java creator James Gosling has said, "It's not so much that Oracle is backing off on EE, but that it's backing off on cooperating with the [Java] community. Taking it 'proprietary', going for the 'roach motel' model of non-standard standards -- 'customers check in, but they don't check out.'"

That's no real surprise. For years, NetBeans battled with Eclipse over which would become the dominant Java IDE. Eclipse won.
True, NetBeans has its die-hard supporters. Zoran Sevarac, a member of the NetBeans Dream Team, for example, likes the proposed deal. "It's a great thing, and it means that NetBeans has an exciting future. The NetBeans community is very positive about this step and sees this as a logical (and good) way to proceed."

Now, with NetBeans, Oracle has proposed that the Apache Software Foundation take over the project. In the proposal, Oracle claims that NetBeans still has 1.5 million developers. I don't believe those numbers.

Gosling, in a Facebook post, agreed. "NetBeans is moving to Apache! Oracle has decided to open up NetBeans even more, so that folks like me can more easily contribute to our favorite IDE. The finest IDE in existence will be getting even better, faster!"
It's a nice thought, but the community is small and getting smaller still. As Janel Garvin, CEO of Evans Data, a company that tracks what languages and tools developers use in the real world, told me, "Eclipse shot past NetBeans years ago in usage. We stopped asking about NetBeans a few years back because no one cared about it anymore."
Still, unlike OpenOffice, NetBeans does have significant programmers who want to improve it, so perhaps NetBeans may yet reinvent itself. I'm just not betting on it.
By Steven J. Vaughan-Nichols for Linux and Open Source

The New NVIDIA GPUs: Founders Editions Review

Kicking Off the FinFET Generation

It has taken about 2 years longer than we'd normally see, but the next full generation of GPUs is finally upon us. Powered by FinFET-based nodes at TSMC and GlobalFoundries, both NVIDIA and AMD have released new GPUs with new architectures built on new manufacturing nodes. AMD and NVIDIA did an amazing job making the best of 28nm over the 4-year stretch, but now, at long last, true renewal is at hand for the discrete GPU market.
Back in May we took a first look at the first of these cards, NVIDIA's GeForce GTX 1080 Founders Edition. Launched at $700, it was immediately the flagship for the FinFET generation. Now today, at long (long) last, we will be taking a complete, in-depth look at the GTX 1080 Founders Edition and its sibling, the GTX 1070 Founders Edition. Architecture, overclocking, more architecture, new memory technologies, new features, and of course copious benchmarks. So let's get started on this belated look at the latest generation of GPUs and video cards from NVIDIA.

NEW GENERATION LAPTOPS
We don't need reasons. You just buy it.
iPineapple?

As a quick refresher, here are the specifications for the new cards. At a high level, the Pascal architecture (as implemented in GP104) is a mix of old and new; it's not a revolution, but it's an important refinement. Maxwell as an architecture was very successful for NVIDIA at both the consumer level and the professional level, and for the consumer iterations of Pascal, NVIDIA has not made any radical changes. The basic throughput of the architecture has not changed: the ALUs, texture units, ROPs, and caches all perform similarly to how they did in GM2xx.
Consequently, the performance aspects of consumer Pascal (we'll ignore GP100 for the moment) are pretty easy to understand. NVIDIA's focus on this generation has been on pouring on the clockspeed to push total compute throughput to 8.9 TFLOPs, and updating their memory subsystem to feed the beast that is GP104.
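The 8.9 TFLOPs figure falls out of shader count times clockspeed. As a rough sanity check (the 2560 CUDA cores and ~1733 MHz boost clock used below are the GTX 1080's publicly listed specs, assumed here rather than stated in this article):

```python
# Back-of-the-envelope check of the ~8.9 TFLOPs figure.
# Theoretical single-precision throughput:
#   FLOPS = shader cores * 2 FLOPs per core per cycle (one FMA) * clock

def sp_tflops(cuda_cores: int, boost_clock_mhz: float) -> float:
    """Theoretical single-precision throughput in TFLOPs."""
    return cuda_cores * 2 * boost_clock_mhz * 1e6 / 1e12

# GTX 1080 (GP104): 2560 CUDA cores at a ~1733 MHz boost clock
print(round(sp_tflops(2560, 1733), 2))  # 8.87, i.e. the ~8.9 TFLOPs quoted
```

Real-world throughput depends on how well workloads keep those FMA units fed; this is only the theoretical peak.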
GeForce GTX 1070
Meanwhile, below the GTX 1080 we have its lower-price, lower-performance sibling, the GTX 1070. The standard high-end salvage part, the GTX 1070 trades off fewer functional blocks and the resulting lower performance in exchange for a significantly lower price than the GTX 1080. From a hardware perspective, the GTX 1070 utilizes GP104 with 1 of the 4 Graphics Processing Clusters (GPCs) disabled. Relative to the GTX 1080, this knocks off around 25% of the shading/texturing/compute performance. However, the memory controllers and ROP partitions remain untouched. With this configuration NVIDIA is pitching the GTX 1070 as a full generational update to the GTX 970, and with any luck, the GTX 1070 will be as well accepted as its extremely successful predecessor.

As for memory, the GTX 1070 doesn't get GDDR5X. Instead the card gets 8GB of GDDR5 running at 8Gbps. This delivers a total memory bandwidth of 256GB/sec, and again unlike the GTX 970, there is nothing going on with partitions here, so all of that memory and all of that bandwidth is operating in one contiguous partition, giving the GTX 1070 an effective memory bandwidth increase of 31%. The GTX 1070 is the first NVIDIA card to ship with 8Gbps GDDR5, a memory speed I once didn't think possible. NVIDIA and the memory partners are pushing GDDR5 to the limit by doing this, but at this point in time this is the most economical way to boost memory bandwidth without resorting to more exotic and expensive solutions like GDDR5X. The GTX 1070 is rated for a 150W TDP; this is a small, 5W increase over its predecessor.
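The bandwidth figures above are simple arithmetic: data rate per pin times bus width. A short sketch, assuming the GTX 1070's commonly listed 256-bit memory bus (a spec not stated in this excerpt):

```python
# Peak memory bandwidth = per-pin data rate * bus width / 8 bits per byte
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

gtx1070 = bandwidth_gbs(8.0, 256)  # 8 Gbps GDDR5 on a 256-bit bus
print(gtx1070)  # 256.0 GB/s, matching the quoted figure

# The ~31% uplift is consistent with comparing against the GTX 970's
# commonly cited ~196 GB/s effective bandwidth (its main 3.5 GB partition),
# a baseline assumed here rather than stated in the article.
print(round((gtx1070 / 196 - 1) * 100))  # 31
```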

BUY THE NEW GTXX 1069!!

This GPU meets your needs! Buy it NOW at www.aluexpress.com and get it drone-delivered to your home!
Source:
http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review

Despite the official TDP, it should be noted that NVIDIA is not pitching this card as their 150W champion for systems with a single 6-pin PCIe power cable, and it will require a more powerful 8-pin cable. For systems that need a true sub-150W card, this is where the GTX 1060 will step in. Otherwise, NVIDIA is making a very interesting power play here: what is now the second most powerful video card on the market does so on just 150W.
Cards, Pricing, & Availability
For the GTX 1000 series, NVIDIA has undertaken a significant change in how they handle reference boards and how those boards are priced. What were once reference boards are now being released as the Founders Edition boards. These boards are largely similar to NVIDIA's last-generation reference boards, built using a standard PCB and NVIDIA's high-end blower cooler, along with some additional cooling upgrades. The Founders Edition cards will, in turn, not be sold at NVIDIA's general MSRP for each family, but rather they will be sold as premium cards for around $80-$100 more.
As a result we have two prices to talk about. For the GTX 1080, the family MSRP is $599. At the base level this is a slight price increase over the GTX 980, which launched at $549. As the Founders Edition cards are not being sold at this price, it is instead being filled by semi and fully custom cards from NVIDIA's partners. These custom cards offer a mix of designs, but at the cheapest level (those cards closest to the MSRP) we're predominantly looking at dual-fan, open-air-cooled cards. The rest of the lineup is filled by more advanced cards (including some with closed-loop liquid coolers) with factory overclocks and other features that are sold at a premium price. The GTX 1080 Founders Edition card, for its part, fits into this picture at $699, a $100 premium. The story is then much the same for the GTX 1070. Its family MSRP is $379, while its Founders Edition counterpart is being sold for $449.

The ECHO IV Home Computer

Science fiction writer William Gibson famously stated, "the future is already here. It's just not very evenly distributed." Past events in the history of technology bear this out: Doug Engelbart, for example, blew the world away in 1968 with his demonstration of a futuristic working prototype of hypertext, windows, the mouse, word processing, videoconferencing and more. It took another 30 or 40 years for the ideas Engelbart showed to a stunned '60s audience to become mainstream. Similar ahead-of-the-curve experiments took place with email, the German WWII V-2 rocket program, and semiconductors, among many other technical and scientific disciplines.
People "ahead of their time" is a common trope in Western culture. From Thomas Edison to Steve Jobs, certain people have been able to see farther than anyone else. In fact, Edison and Jobs share something remarkable: they were ecosystem builders. Edison, as important as his perfecting of the electric light bulb was, also created an infrastructure for the bulb to live in: power plants and massive amounts of wiring. He built his first power station on Pearl Street in NYC, right near his Wall Street investors. Jobs, with Apple's series of "i" devices, defined a software ecosystem of digital content as important as the devices themselves.

50 Years Later: The Future Is Already Here

Jim Sutherland inside a Westinghouse Prodac-IV industrial process control computer system.

Programming and interacting with ECHO IV was accomplished by several means: front-panel switches on the main cabinet, a programmer's keypad (for octal) near the main cabinet, a paper tape reader and punch, and the kitchen console, which was based on an IBM 735 Selectric typewriter and was used for word processing.
This latter ability deserves a closer look. With ECHO IV, documents typed on the Selectric keyboard could be stored in ECHO IV's memory, to be reprinted later. Formatting changes and page numbers could be automatically added to printed documents, and in 1975, ECHO IV was used to format a 516-page scholarly book on post-Revolutionary War land grant surveys. Here again is Gibson's "unevenly distributed" future: it would be decades before people would be doing word processing at home on their own computers. The persistence of an unchanging domestic hierarchy of needs met the prescience of Jim Sutherland, whose unique computer stimulated such thinking about the future role of a computer in the home. It took the technical ability and personal family motivation of Jim Sutherland to show us what might be coming, fifty years ago.

IBM's New Quantum Processor

Tucked away at IBM Corp.'s T.J. Watson Research Center, in a high-tech fridge cooled to almost absolute zero, is an experimental chip that could help advance scientific inquiry further than any conventional supercomputer. It's the culmination of a more than three-decade research effort that the technology giant launched shortly after Richard Feynman first proposed the idea of quantum computing in a 1982 paper.

From theory to practice

In his thesis, Feynman theorized that the special laws governing subatomic particles could be exploited to surpass the capabilities of regular computers restricted by classical physics. The core premise of the technology has since been repeated countless times in science journals: whereas a bit in a normal machine can only represent one of two values, 1 or 0, its quantum counterpart has a third possible state wherein it's set to 1 and 0 at the same time. This quirk is owed to a phenomenon known as quantum superposition, which is illustrated by the famous Schrödinger's cat thought experiment.

A qubit may thus represent three values simultaneously, while a pair can be used to represent seven with enough effort, three can represent 15, and so on. And if 100 such qubits were to be implemented on the same chip, as physicist Hans Robinson postulated in a 2005 New Scientist article, then the number of possible states could climb to more than a million trillion trillion. That's 1 followed by 30 zeros, which far exceeds the capacity of current supercomputers. As a result, a 100-qubit computer would be excellent at performing tasks that involve going through a large number of different mathematical combinations.
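The scaling described above is just exponential growth of the state space: n qubits span 2^n classical basis states. A one-liner makes the "1 followed by 30 zeros" figure concrete:

```python
# n qubits span 2**n classical basis states; Python's arbitrary-precision
# integers make the 100-qubit figure easy to check exactly.
def n_states(n_qubits: int) -> int:
    return 2 ** n_qubits

states = n_states(100)
print(states)            # 1267650600228229401496703205376
print(len(str(states)))  # 31 digits, i.e. on the order of 1e30
```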

Breaking the barrier

Dubbed Quantum Experience, the demo at the T.J. Watson Research Center manages to fit five qubits on a single processor by exploiting a breakthrough that IBM engineers announced last April. Their discovery provides a way of correcting the errors that form in semiconductors over time due to external factors such as heat and background radiation.
A qubit is vulnerable to two main types of glitches: bit-flips, wherein a 1 changes to a 0 or vice versa, and phase-flips, which interfere with the relationship between the two values when the qubit is in a superimposed state. Theoretical physicists led by MIT's Peter Shor have developed methods to fix both over the past two decades, but historically, only one of the errors could be corrected if they manifested in a qubit at the same time. As a result, a company that had tried building a five-qubit computer a few years ago would've seen its system overwhelmed with unchecked data corruption issues to the point of becoming unusable.
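The bit-flip half of the problem has a familiar classical analogue, the repetition code. A toy sketch of that idea (not IBM's actual scheme, which must also correct phase-flips without directly measuring the quantum state):

```python
# Toy classical analogue of bit-flip error correction: the 3-bit
# repetition code. Encode one logical bit as three copies and recover
# by majority vote. Real quantum codes (e.g. Shor's 9-qubit code) must
# additionally handle phase-flips, which have no classical counterpart.

def encode(bit: int) -> list:
    return [bit, bit, bit]

def flip(codeword: list, position: int) -> list:
    corrupted = codeword.copy()
    corrupted[position] ^= 1  # simulate a single bit-flip error
    return corrupted

def decode(codeword: list) -> int:
    return 1 if sum(codeword) >= 2 else 0  # majority vote

# Any single bit-flip is corrected; two simultaneous flips would not be.
assert decode(flip(encode(1), 0)) == 1
assert decode(flip(encode(0), 2)) == 0
```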

By overcoming the problem, Quantum Experience paves a path for bigger and better quantum computers to be built in the coming decades. But there are still many more challenges that need to be addressed before IBM's vision can be realized, not least of which is the task of developing suitable software for quantum computers. After all, a genome sequencing or encryption application created to run on conventional hardware isn't exactly equipped to exploit superimposed numerical states.

Cloud Savvy

To ensure that there will be software to take advantage of quantum computers by the time they become a reality, IBM is making its chip accessible to academia through a free cloud service. The new algorithms that researchers develop using the processor may very well end up finding use in the quantum computers of tomorrow, if and when they arrive, and not only in those developed by Big Blue. Alphabet Inc., Canada's D-Wave Systems Inc. and a number of other companies are also working to hurry along the quantum computing revolution.

Enthralling visions for the future of computing

For years, our personal computers were made up of monitors, keyboards, and a big beige box. Then laptops came along and changed everything, until a small, flat plate of glass encased in metal, dubbed the iPhone, showed up and changed everything again, followed shortly thereafter by an even larger plate of glass called the iPad that changed things even more. As exciting as the iPad was, the original came to us five years ago. Today, we once again face major shifts in computing. What will that future look like, both in the near term and the slightly further-off future?

Bendy tablets

The idea of a flexible display has actually been around for a while now. LG was talking up a bendable plastic for displays as early as 2010, and Samsung had a similar technology around the same time that made it into an actual product in 2013 called the Samsung Galaxy Round, though that phone's slight curve was far from being a truly flexible display. And who can forget the impressive Plastic Logic e-reader demos that popped up at conferences and tradeshows around 2008-2009? Plastic Logic even gave its e-reader a name, Que, but the product ultimately failed to make it to market.

In 2011, Microsoft conceived of a modular card system that would replace your smartphone by 2019. The latest video, called Productivity Future Vision, still has the smartphone cards, and adds some other deeply enthralling concepts. One of them is a tablet made of a mousepad-like material that you can bend and fold. It basically looks like a thick color e-ink touchscreen with no bezel.

Bendy prototype by LG

Microsoft HoloLens

Augmented reality

A close cousin of virtual reality, augmented reality is something we've been playing with on smartphones for years.

The easiest way to think of the difference is that virtual reality immerses you in a 100-percent digital experience, while augmented reality creates a digital overlay on top of the physical world.
Microsoft's HoloLens captures the most attention in the augmented reality realm these days. The device may soon allow you to fight off Minecraft Zombies and Creepers coming at you from behind your couch. It will allow medical students to view a 3D model of the heart right in the middle of the classroom, help non-electricians successfully wire a broken light switch, and much more.

By Ian Poll via http://www.pcworld.com/article/2988179/computers/10-enthralling-visions-for-the-future-of-computing.html

Elon Musk's Neural Lace

Achieving Symbiosis With Machines

Elon Musk is having a hard time at the moment. Amid all the sound and fury, however, it's sometimes easy to forget that he's constantly coming up with new, visionary ideas, including the Hyperloop. Another future endeavor that may have been lost in the noise involves a so-called "neural lace," an interface that links human brains with computer software.
After discussing the possibility of such a device at Code Conference in California this June, Musk took to Twitter to update the world on the idea. He claims that a neural lace will help humans "achieve symbiosis with machines," a subset of a movement known as transhumanism.
According to Inverse, Musk's invention will be a computer interface woven into the brain, allowing the user to access, for example, the Internet just by thinking, and perhaps even store backups of a person's mind in case the person physically dies. By being wirelessly enabled, the device could allow us to write, paint, and communicate just by thinking.

It could either be passive, representing an implanted, glorified smartphone, or it could be active and directly communicate back and forth with our mind by interfering with our brain's thought patterns. Musk is a firm believer that artificial intelligence (AI) will outmaneuver our own in the future, and this could be seen as a way of allowing us to "team up" with it, to keep pace with it, so we aren't left behind.
Musk isn't the only one interested in neural interfaces. Another is the US military's scientific division, the Defense Advanced Research Projects Agency (DARPA). Not content with developing autonomous robotic soldiers capable of empathy, or vampire drones that disappear in sunlight, the secretive military department has long been interested in brain implants that "fix" neurological damage sustained in warfare, and a neural interface is the next step up from this.
The brain operates using electrical signals, and although they are generated biochemically, there's no reason why they shouldn't be compatible with computer systems. The key difference is that a computer system uses binary signals, whereas a human brain converts billions upon billions of bioelectrochemical conductions into abstract thoughts and concrete actions every single second. You don't have to be Musk to realize that there's a huge technical gap that needs to be surmounted.

Still, thought-controlled prosthetics are a real-life invention, so it's not unrealistic to think that, eventually, humans and computers could communicate effectively. However, these limbs move with a moderate degree of precision based on a few tens of thousands of electrical neural impulses. The brain involves magnitudes more than this, so at the moment, the technology is relatively primitive.

"Making progress. Maybe something to announce in a few months. Have played all prior Deus Ex. Not this one yet." -Elon Musk

His latest tweet suggests that he's "making progress" on the idea, and he may be about to announce something in the next few months. He does have a few more pressing issues to navigate, however, such as the deeply unfortunate SpaceX-related fireball.

Neuron-machine symbiosis

There's a long way to go, then, before this neural lace becomes a reality. In the meantime, Musk is clearly still keeping in touch with his more grounded, Earth-bound side.
-Joe Carmichael for www.inverse.com

Credits

Luis Gordo Soldevila, Carlos Romero Beltran and Alan Vicent Miralles worked together to get this work done.

METAL SLUGS DO IT AGAIN!!

ESC.
September 2016
