Simon Wistow
This is me
These are some of the places I've worked. Some of them were even cool at the time.
I've worked at a whole bunch of other startups. Many of them are dead.
I also used to work doing VFX for films, including this cheeky chappy.
This handsome devil (ask me afterwards for the story involving the chocolate river).
But in Australia. And yes, I did have to wear chaps. Although I am at pains to point out that ALL chaps are assless. Otherwise they're just leather trousers. I also had to shave my legs. True story!
The problem is that they all tend to slavishly follow the Gartner Hype Cycle.
Then comes the inevitable backlash. A quick straw poll - I want to store my users' passwords - what should I use?
Of course, bcrypt (or equivalent). But let's go back to node.js ... what might be the problem?
10 users trying to log in simultaneously when you have a work factor of 12 means that user number 10 will take about 3-5 seconds to log in, because each hash runs to completion on node's single event-loop thread. And that's only 10 people.
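A rough sketch of the queueing math behind that claim. The ~300 ms per hash at work factor 12 is an illustrative assumption, not a benchmark; the point is only that sequential hashes on one thread stack up linearly.

```python
# Why synchronous bcrypt hurts a single-threaded server: if every hash
# runs to completion on the one event-loop thread, simultaneous logins
# queue up behind each other.

HASH_MS_AT_COST_12 = 300  # assumed time for one bcrypt hash, milliseconds


def wait_time_ms(position, hash_ms=HASH_MS_AT_COST_12):
    """Time until the Nth simultaneous login completes."""
    return position * hash_ms


for n in (1, 5, 10):
    print(f"user {n}: waits {wait_time_ms(n) / 1000:.1f} s")
# user 10 waits 3.0 s -- roughly the 3-5 second figure above
```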
See also, for example, the whole "MongoDB is web scale" thing. The problem is that people don't actually think about their problems.
Or this one (or any of their similar ones). The problem is that people didn't seem to understand them - they thought of them as recipes, not biographies, and instead enshrined them as a set of rules: "Thou shalt shard", "Thou shalt use memcached".
So, what has this got to do with Fastly? A little about us - we're a next-generation content delivery network. We can cache dynamic as well as static content. We're real-time, and we give you insane amounts of control over a request's life cycle.
I am contractually obliged to include this joke. It's part of the Secret Guild of CDN Providers' rules.
http://www.fastly.com/demo
This is a demo of our real-time stats showing a subset of our customers. We get around a 90-95% hit rate, depending on what sort of day some customers are having. Our time to first byte is under 1 ms at the 99th percentile and under 300 microseconds at the 95th. We currently have 8 datacenters, with new ones opening in new territories soon.
We're pathologically lazy. Out of interest - anybody seen the Halloween Garfield cartoon? It's insane.
We famously use SSDs everywhere. But aren't SSDs expensive? Not if you count IOPS/$.
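To make IOPS-per-dollar concrete, here's a back-of-the-envelope comparison. The drive specs and prices are illustrative assumptions of my own, not quoted figures from the talk:

```python
# Rough IOPS-per-dollar comparison, spinning disk vs SSD.
# All numbers below are assumed, order-of-magnitude figures.

drives = {
    "15k RPM HDD": {"iops": 180,    "price_usd": 200},
    "SATA SSD":    {"iops": 50_000, "price_usd": 500},
}

for name, d in drives.items():
    per_dollar = d["iops"] / d["price_usd"]
    print(f"{name}: {per_dollar:.1f} IOPS per dollar")
# Per gigabyte the SSD loses; per random IOP it wins by ~2 orders
# of magnitude.
```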
We try and do less with more, and we really try to understand the full stack. We're willing to put the effort in where necessary. We wrote what amounts to our own filesystem. And because we don't trust Linux, we have a custom kernel build which makes fsync a no-op.
gethostbyname(3)
Another example of stuff we're willing to do ... How many of you have done any assembler? [ INSERT assembler dispatch anecdote ]
However, there's some stuff we just don't care about. The speed of our API just isn't an issue. We happen to use Unicorn. When people start doing microbenchmarks of Unicorn vs $Whatever it makes me sigh ...
Far better off with subjective wins, like making sure your site *feels* more responsive.
We use MySQL Cluster. This is not a picture of MySQL Cluster, but it's far more visually interesting. [ INSERT Pros and Cons of MySQL Cluster ] [ INSERT Why not Riak etc etc ]
Which led to Artur tweeting this [ INSERT anecdote about ZeroMQ ] [ INSERT anecdote about switching to HTTP ]
The games industry generates stuff like this 60 times a second. We concatenate a bunch of strings together twice a second and call it good.
Another example: let's go back to messaging again. These fine fellows are one of the top 5 websites in the world. They are primarily a messaging website.
By far their number 1 messaging moment came when Castle in the Sky, a 25-year-old anime by Miyazaki, was aired on Dec 9th, 2011. They peaked at 25,088 messages a second (by contrast, the next biggest, the Euros final, got about 15,358). Let's assume that's inbound tweets, not total tweets, and assume, say, a 10x fan-out.
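The fan-out arithmetic works out like this. The 25,088/sec peak is from the slide; the 10x fan-out is the stated assumption about deliveries per inbound message:

```python
# Fan-out arithmetic for the Castle in the Sky tweet peak.

peak_inbound_per_sec = 25_088   # peak inbound messages/sec (from the slide)
fanout = 10                     # assumed deliveries per inbound message

deliveries_per_sec = peak_inbound_per_sec * fanout
print(f"{deliveries_per_sec:,} deliveries/sec")  # 250,880 deliveries/sec
```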
In contrast, take this story from Forbes magazine in 2009 about NASDAQ. http://www.forbes.com/forbes/2009/0112/056.html Obviously that's not exactly an apples-to-apples comparison, but ...
Let's talk about big data. Every time people on Hacker News talk about Big Data, someone at CERN keels over with sadness. It's true. I read it on Digg. Facebook has 40 billion photos. Walmart does over 1 million transactions an hour. They have a database with over 2.5 petabytes. The LHC collected 13 PB of data.
I don't mean to sound relentlessly negative, though. As developers, we're dealing with ever-increasing amounts of data - when I was building the search engine for LiveJournal in 2006, it was designed to deal with at least 6 billion objects. That's only half of what Yahoo! and Google had in 2001.
So playing around with new stuff is a good way to stop your brain atrophying, and in the future these kinds of techniques are going to be vital.
But let's get some perspective: don't get mired in the "X SUX, Y IS TEH WIN!" mentality, and remember that over-focusing on one particular detail is not nearly as good as having an overall understanding of how everything fits together.
Dont slavishly copy something just because someone else is doing it. The best technique is to analyze your problems and solve them instead.
Be language agnostic.
Be technology agnostic.
Join the cult of DevOps.
Look to the past.
Be open-minded about the future.