
Entries for a campaign in chaos

Analytics
Black Mirror – Waldo
Chaos – “Kronos (time) is a temporarily organized chaos”
microtargeting
Peter Sloterdijk reconstructed the political history of rage

Simone Lenzi – troco


The Massachusetts Institute of Technology (MIT) demonstrated that a piece
of false information is, on average, 70% more likely to be shared on the
internet, because it is generally more novel than a true news story.
According to the researchers, on social networks the truth takes six times
longer than a fake news item to reach 1,500 people

a social media company must, above all, do things in such a way that users
become agitated, feel endangered, or are afraid. The most effective
situation is the one in which users enter strange spirals of very powerful
consensus or, on the contrary, of serious conflict with other users. It
never ends, and that is precisely the goal.

It is others who are incited to do the dirty work. Like the young
Macedonians who supplement their monthly budget by posting poisoned fake
news. Or the Americans eager to make some extra money.”

In France, the Yellow Vests movement has fed from the start on two
ingredients: the anger of certain working-class segments and Facebook's
algorithm,
“fight clubs for cowards,” in Marylin Maeso's apt definition.
A fearsome machine that feeds on rage and whose sole principle is the
engagement of its followers. The important thing is to feed it
constantly with “hot” content that stirs emotions.
Behind Davide Casaleggio's desk in Milan, a monitor measures, in real
time, the popularity of the content posted on the various digital
platforms of the Five Star Movement galaxy. Whether it is positive or
negative, progressive or reactionary, true or false does not matter. The
ideas that please are developed and relaunched, turning into viral
campaigns and political initiatives.

Restitui (Give it back)
I want back what is mine
The time that was stolen
David Sassi

In Italy, Casaleggio Associati drew up, starting in 2014, a code of
conduct for elected members of the Five Star Movement appearing on TV
programs. On the subject of immigrants, the text suggests adopting the
following attitude:
“The subject of immigration arouses many emotions, first among them fear
and anger. So, on television, starting to argue, explaining the treaties,
or even proposing more or less realistic solutions is useless. People are
gripped by their emotions and feel that they and their families are
threatened. One cannot expect them to follow a purely rational
discourse.”
The M5S vade mecum concludes by suggesting the strategy of the
“emotional decal, or transfer”:
“We are an outlet for anger and fear.”

"clustering is in the eye of the beholder."[5]


Estivill-Castro, Vladimir (20 June 2002). "Why so many clustering
algorithms – A Position Paper". ACM SIGKDD Explorations Newsletter. 4
(1): 65–75. doi:10.1145/568574.568575.

find a local optimum

Mean-shift is a clustering approach where each object is moved to the
densest area in its vicinity

density attractors
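The mean-shift idea above (objects iteratively moved toward local density attractors) can be sketched as follows. This is a minimal flat-kernel version for illustration, not any particular library's implementation:

```python
import numpy as np

def mean_shift(points, bandwidth=1.0, n_iter=50, tol=1e-4):
    """Move each point toward the densest area in its vicinity:
    the mean of all data points within `bandwidth` of it."""
    shifted = points.astype(float).copy()
    for _ in range(n_iter):
        moved = 0.0
        for i, p in enumerate(shifted):
            # flat kernel: average of all original points near p
            dist = np.linalg.norm(points - p, axis=1)
            new_p = points[dist <= bandwidth].mean(axis=0)
            moved = max(moved, np.linalg.norm(new_p - p))
            shifted[i] = new_p
        if moved < tol:  # everything has settled on an attractor
            break
    # points that converged to (nearly) the same attractor share a cluster
    labels = np.unique(np.round(shifted, 1), axis=0, return_inverse=True)[1]
    return shifted, labels
```

Points that end up at the same density attractor form one cluster; the bandwidth implicitly controls how many clusters emerge.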

to trade semantic meaning of the generated clusters for performance

to better understand the relationships between different groups of
consumers/potential customers, and for use in market segmentation,
product positioning, new product development and selecting test
markets.

use various means of communication such as direct mail, phone calls,
home visits, television, radio, web advertising, email, and text messaging,
among others, to communicate with voters, crafting messages to build
support for fundraising, campaign events, volunteering, and eventually to
turn them out to the polls on the election day.

Microtargeting's tactics rely on transmitting a tailored message to a
subgroup of the electorate on the basis of unique information about that
subgroup

in Iowa the (2004) campaign was able to reach 92% of eventual Bush
voters (compared to 50% in 2000) and in Florida it was able to reach 84%
(compared to 50% in 2000).[4] Much of this pioneering work was done by
Alex Gage and his firm, TargetPoint Consulting.

large and sophisticated databases that contain data about as many voters
as possible. The database essentially tracks voter habits in the same ways
that companies like Visa track consumer spending habits

These databases are then mined to identify issues important to each voter
and whether that voter is more likely to identify with one party or
another. Political information is obviously important here, but consumer
preferences can play a role as well. Individual voters are then put into
groups on the basis of sophisticated computer modeling. Such groups
have names like "Downscale Union Independents", "Tax and Terrorism
Moderates," and "Older Suburban Newshounds."[4][7]
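The grouping step described here is, at bottom, a clustering problem over voter feature vectors. A minimal k-means sketch follows as a stand-in for the "sophisticated computer modeling" the excerpt mentions; the numeric features and the naive initialization are hypothetical:

```python
import numpy as np

def kmeans(features, k, n_iter=100):
    """Group rows of `features` (one voter per row) into k segments."""
    # naive init for a sketch: take the first k rows as starting centers
    centers = features[:k].astype(float).copy()
    for _ in range(n_iter):
        # assign each voter to the nearest segment center
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :],
                               axis=2)
        labels = dists.argmin(axis=1)
        # recompute each center as the mean of its assigned voters
        new_centers = np.array([features[labels == j].mean(axis=0)
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

A real campaign database would use many more features (party registration, consumer preferences, turnout history) and far more robust initialization; the resulting segments are then given labels like the ones quoted above.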

What is microtargeting?
https://blog.mozilla.org/internetcitizen/2018/10/04/microtargeting-dipayan-ghosh/

Microtargeting is a marketing strategy that uses people’s data — about
what they like, who they’re connected to, what their demographics are,
what they’ve purchased, and more — to segment them into small groups
for content targeting. It’s the reason that if you typically shop at Whole
Foods, you may be served an advertisement for organic sunscreen during
the Summer. And while it can help deliver content that is interesting and
helpful to you, it also has a dark side — especially if it delivers information
that’s inaccurate or biased and meant to sway your vote.

two key facts to consider about traditional media:

1. Traditional media platforms push content (e.g., political ads on a TV
   network) to very large, broad audiences.
2. Political ads disseminated over traditional media outlets necessarily
   receive tremendous public scrutiny for compliance with federal
   election regulations, a fact that naturally assures that ads that appear
   on traditional media usually do not contain falsehoods.

A filter bubble – a term coined by internet activist Eli Pariser – is a
state of intellectual isolation that allegedly can result from
personalized searches, when a website algorithm selectively guesses what
information a user would like to see based on information about the user,
such as location, past click behavior and search history.[2][3][4] As a
result, users become separated from information that disagrees with their
viewpoints, effectively isolating them in their own cultural or
ideological bubbles.[5] The choices made by these algorithms are not
transparent.[6] Prime examples include Google Personalized Search results
and Facebook's personalized news stream. The bubble effect may have
negative implications for civic discourse, according to Pariser, but
contrasting views regard the effect as minimal and addressable. The
results of the 2016 U.S. presidential election have been associated with
the influence of social media platforms such as Twitter and
Facebook,[9][10] which in turn called into question the effects of the
"filter bubble" phenomenon on user exposure to fake news and echo
chambers, spurring new interest in the term, with many concerned that the
phenomenon may harm democracy.

(Technology such as social media) “lets you go off with like-minded
people, so you're not mixing and sharing and understanding other points
of view ... It's super important. It's turned out to be more of a problem
than I, or many others, would have expected.”
— Bill Gates 2017 in Quartz
According to Pariser, the detrimental effects of filter bubbles include harm
to the general society in the sense that they have the possibility of
"undermining civic discourse" and making people more vulnerable to
"propaganda and manipulation". He wrote:

A world constructed from the familiar is a world in which there's nothing
to learn ... (since there is) invisible autopropaganda, indoctrinating us
with our own ideas.
— Eli Pariser in The Economist, 2011[30]

Many people are unaware that filter bubbles even exist. This can be seen
in an article in The Guardian, which mentioned the fact that "more than
60% of Facebook users are entirely unaware of any curation on Facebook
at all, believing instead that every single story from their friends and
followed pages appeared in their news feed."[31] In brief, Facebook
decides what goes on a user's news feed through an algorithm that takes
into account "how you have interacted with similar posts in the past."
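The kind of engagement-based curation quoted above can be illustrated with a toy ranking function. The post/history schema here is entirely hypothetical and is not Facebook's actual algorithm:

```python
def rank_feed(posts, user_history):
    """Toy personalization: score each candidate post by the user's past
    engagement with posts on the same topic (hypothetical schema)."""
    def score(post):
        past = [h for h in user_history if h["topic"] == post["topic"]]
        clicks = sum(h["clicked"] for h in past)
        return clicks / (len(past) + 1)  # smoothed engagement rate
    # topics the user engaged with before float to the top of the feed
    return sorted(posts, key=score, reverse=True)
```

Even this crude rule reproduces the bubble dynamic: topics you clicked on before get shown first, so you keep clicking on them, and the gap widens.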

A filter bubble has been described as exacerbating a phenomenon that has
been called splinternet or cyberbalkanization,[Note 1] which happens
when the Internet becomes divided up into sub-groups of like-minded
people who become insulated within their own online community and fail
to get exposure to different views.

Organizations such as the Washington Post, The New York Times, and
others have experimented with creating new personalized information
services, with the aim of tailoring search results to those that users are
likely to like or agree with.
