Ross: The Industries of the Future

The importance of a book like “The Industries of the Future”1 by Alec Ross can hardly be overstated. By his own admission, “This book explores the industries that will drive the next 20 years of change to our economies and societies” (p. 12). Whether or not the author succeeds in his ambitious task, he surely starts from quite a vantage point: as former senior adviser for innovation to Secretary of State Hillary Clinton, he oversaw the transition to digital ecosystems of many an operation across the globe. The book is a rich mine of leads, with plenty of references to keep track of going forward. Given the scope of the book, the review here concentrates on its main undercurrents.

The problems addressed in the book naturally cluster into a neatly organized structure, explored in another section of this blog. Here is a quick summary:
  1. Robotics
  2. Genomics
  3. Algorithmic money, markets and trust
  4. Big Data
CH 1: HERE COME THE ROBOTS
WHAT. Cutting-edge advances in the robotics landscape will be differentiated by country:
“Just as wealthier and poorer citizens reside at different technological levels, so do wealthier and poorer countries” (p. 19) – where the “big five” (Japan, China, US, South Korea, Germany) will be able to accrue huge benefits from their coming preeminence in the robotics ecosystem.
WHY. What is the reason for this Cambrian explosion in the robotics ecosystem? The confluence of enabling technologies (p. 23): improvements in belief space (Bayesian modeling); cloud (swarm) robotics; new materials.

The impact will be ubiquitous: the automotive industry (driverless cars: Google X), the operating room (SEDASYS, and nanorobots for cancer radiation), academia (Aldebaran robots teaching computer science classes), hospital and human care (therapeutic robots), etc.

The adoption pattern of these technologies will consist of high up-front capex (the cost of robot labor) that creates offsetting savings in opex (the cost of human labor) (pp. 37ff). Decisions like the Taiwanese Foxconn swapping robots for one million humans (p. 35) will, if scaled up, create huge geographical tensions, e.g. in China, where a forced urbanization policy to keep labor costs down has been enacted by fiat for more than 30 years.
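To put the trade-off in the simplest possible terms (my gloss, not the book's): if C is the up-front capex of a robot, c_h the yearly cost of the human labor it replaces and c_r its yearly running cost, the robot pays for itself after

t^* = \frac{C}{c_h - c_r}

years; falling robot prices (smaller C) and rising wages (larger c_h) both shorten t^* and hence accelerate adoption.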
WHERE. Different countries will react to the shifting landscape in different ways:
while everywhere “the ratio of [capex and opex] will determine the future of work related patterns” (p. 37), there will be places like Africa where the robotics revolution, because it is married to frugal innovation, will provide leapfrogging opportunities.
CH 2: THE FUTURE OF THE HUMAN MACHINE
The opening remark is wonderful: “The last trillion-dollar industry was built on a code of 1s and 0s. The next will be built on our own genetic code” (p. 44) and “Genomics is going to have a bigger impact on our health than any single innovation of the 20th century” (p. 74). Why is Ross so optimistic?
After the sequencing of the entire human genome (2000), “the size of the genomic market” today, because of the falling cost of sequencing and of commercialization, is where e-commerce was in 1994 (p. 48). Think of Personal Genome Diagnostics (PGDx) and “liquid biopsy” – i.e. the comparison of tumor cells with normal cells in the same individual via big data analytics; think of CRISPR and designer babies; think of Craig Venter’s latest projects: (a) Synthetic Genomics’ xenotransplantation (see here) and (b) Human Longevity Inc. (p. 62).
Here the old nature vs. nurture debate risks resurfacing in a ghastly shape, whereby socioeconomic fault lines (nurture) can be frozen in biological terms (nature). Moreover, and more concretely, by now this is no longer a Western-only enterprise: with the Beijing Genomics Institute, China is willing to win the genomics battle the way the US won the internet race (p. 66).
  • More books/talks on this here.
CH 3:  CODE-IFICATION OF MONEY, MARKETS, TRUST
This section argues that the code-ification and app-ification of money, markets, payments and trust is a big inflection point for the disintermediation of a large part of the current economy. Square, Alipay and Google Wallet are the next iteration of digital money – while the African-based M-Pesa has succeeded in leapfrogging the banking system altogether in countries (like Kenya) where the physical infrastructure is lackluster or non-existent.

The big trend at work here is the interplay of dispersion and concentration: local communities of buyers and sellers are certainly empowered by the availability of decentralized, peer-to-peer (payment) solutions like M-Pesa, but at the same time the central routing of these transactions is operated in a progressively more centralized way. “Coded markets like eBay and Airbnb simultaneously concentrate and disperse the market. With coded markets available to even the smallest vendors, a trend has arisen that pushes economic transactions away from physical stores and hotels toward individual people. [...] The route through which it is dispersed, however, redirects each of those transactions through a small number of technology platforms usually based in California or China” (p. 93).

But arguably the deepest innovation coming from techno-utopianism in the markets, payments & trust ecosystem is the Blockchain (original paper by Nakamoto; some links here). An investor is quoted (p. 115) as saying that “the problem with the Internet from 1995 to 2010 was that it enabled information dissemination and communication but lacked any ability to transfer value between individuals. From 1995 to 2010 every industry in information services was transformed beyond recognition – newspapers, music, TV, etc. – as was any industry involved in communication and connection between individuals – phone, fax, auction, recruiting, etc. [...] Conversely, from 1995 to the present day there has been almost no impact by the Internet on the financial services or legal industries.”
The importance of the Blockchain and of distributed ledgers as an enabling ecosystem in which smart contracts can render entire industries obsolete or radically disrupt their internal workings (as in the financial industry) deserves its own section, which will be regularly updated – hence the discussion here can close by quoting MIT Media Lab director Joi Ito (p. 116): “My hunch is that the Blockchain will be to banking, law and accountancy as the Internet was to media, commerce and advertising. It will lower costs, dis-intermediate many layers of business and reduce friction. As we know, one person’s friction is another person’s revenue”.
  • More books/talks on this here.
CH 5: DATA: THE RAW MATERIAL OF THE INFORMATION AGE
WHAT. A few figures first: “As recently as 2000, only 25 percent of data was stored in digital form. Less than a decade later, in 2007, that percentage had skyrocketed to 94 percent” (p. 154). This is the dataquake. “Big data is just the application of the commodification of computing power combined with the wider availability of cloud computing” (p. 157).
The areas that will see the most action are:
–  Human interfaces and machine translation (p. 159): “Universal machine translation will accelerate globalization on a massive scale”. Advances in bioacoustic engineering will deliver sleek interfaces – no more robotic voices within the next ten years;

–  Precision agriculture – whether native or retasked, as Monsanto with FieldScripts (p. 162) – will reshape the agribusiness landscape: “The promise of precision agriculture is that it will gather and evaluate a wealth of real-time data on factors including weather, water and nitrogen levels, air quality, and disease – which are not just specific to each farm or acre but specific to each square inch of that farmland” (p. 162);
–  FinTech (pp. 168ff). The financial industry is in essence an information-processing operation. “A bank is basically a giant ledger of contracts that have future positive and negative cash flows. A bank’s entire income is based on how the present value of those cash flows changes moment to moment” (p. 170). FinTech arises because banks struggle to roll up their analytics into one central view of their cash flows. Standard Treasury, the “digital first” bank, aims to do exactly that. Another FinTech area is real-time screening: “In a coded-money economy, a lender knows a merchant’s true value because it has real-time access to its books” (p. 171). Knowing all the transactions allows Square Capital (ibidem) to open credit lines and grow the business of its clients with unprecedented accuracy;

–  Our quantified selves. Delegating more and more of our decisions to non-human actors will trigger important questions regarding our agency. “Serendipity fades with everything we hand over to algorithms. Most of these algorithms are noiseless. They gently guide us in our choices. [...] And because they constitute the value of a company’s intellectual property, there is an incentive to keep them opaque to us” (pp. 180-181).
A couple more points: “When data goes from being unstructured to structured, it takes on the values and prejudices baked into its formulation” (p. 183) and “Correlations made by big data are likely to reinforce negative bias” (p. 184). A thoughtful discussion of this theme – Dataism – can be found in an article by Yuval Harari in the Financial Times.
  •  More books/talks on this here.
Concluding remarks
Is Silicon Valley going to exert an increasing gravitational pull, or will the decentralization triggered by the commodification of big data ecosystems (AWS) allow business to spread across the geographical avenues of domain expertise? Where will the focal points of the “next economy” and its accompanying class be? In alpha cities (London, Tokyo, NY, Singapore) and in places like Estonia, “the little country that could” (see its e-residency scheme), and Israel – or will the geographical gradient be less steep?
“With these platforms the Valley has become like ancient Rome. It exacts tribute from all its provinces. The tribute is the fact it owns these platform businesses. [...] The value flows to one of the places of the world that can produce tech platforms. So the global regional inequality is going to be unlike anything we have ever seen”, argues an investor2 (p. 94) in the book.
Will the vision of world leaders be commensurate with such scenarios? The big forces shaping our future – technology, the platform and freelance economy, the environment – no longer fall into the old ideological divides. The twin faiths of the Age of Extremes – capitalism and communism – were both based on epistemological fallacies: the first, that the randomness of the economic process could be eliminated in toto; the second, that such randomness acts for the benefit of human society. “The principal political binary of the last half of the 20th century was communism versus capitalism. In the 21st century it is open versus closed”, argues Ross (p. 215). On such a hopeful note our review of Alec Ross’s wonderful book ends. It should be mandatory reading for all thinking people.

1. [Ross, Alec. The Industries of the Future. Simon & Schuster, 2016.]
2. [Charlie Songhurst; see here.]

Video Transcoding: Handbrake + libdvdcss

Suppose you had a collection of DVDs which you lawfully bought in the marketplace.
Suppose you wanted to watch them on your iPad while running at the gym. You need to transcode them, i.e. convert them into another format (in my case .mp4).

Surprisingly, this simple problem is not that tractable unless you commit money to some expensive proprietary software solution. The issue is the Content Scramble System (CSS), a lawful encryption scheme that scrambles the contents of the DVD and gets in the way of reading them.

A simple transcoder – a utility that converts one format into another – may fail if the encryption is not taken care of. And this is exactly what happens with a naive use of the marvellous free and open-source transcoder Handbrake.


Handbrake needs the crucial library that inverts the scrambling: libdvdcss.

Here then is the solution, for Handbrake 0.10.5.0 (64-bit) on Windows. Download libdvdcss-2.dll from
http://download.videolan.org/libdvdcss/1.2.11/win32/libdvdcss-2.dll (32-bit version) or http://download.videolan.org/libdvdcss/1.2.11/win64/libdvdcss-2.dll (64-bit version).

Then move libdvdcss-2.dll into your Handbrake install directory (usually C:\Program Files\Handbrake\).
Enjoy transcoding!

Feed-forward networks and teleology

Bertrand Russell, in “History of Western Philosophy”, pp. 86-87, writes:

The atomists, unlike Socrates, Plato, and Aristotle, sought to explain the world without introducing the notion of purpose or final cause. The “final cause” of an occurrence is an event in the future for the sake of which the occurrence takes place. In human affairs, this conception is applicable. Why does the baker make bread? Because people will be hungry. Why are railways built? Because people will wish to travel. In such cases, things are explained by the purpose they serve. When we ask “why?” concerning an event, we may mean either of two things. We may mean: “What purpose did this event serve?” or we may mean: “What earlier circumstances caused this event?” The answer to the former question is a teleological explanation, or an explanation by final causes; the answer to the latter question is a mechanistic explanation. I do not see how it could have been known in advance which of these two questions science ought to ask, or whether it ought to ask both. But experience has shown that the mechanistic question leads to scientific knowledge, while the teleological question does not. The atomists asked the mechanistic question, and gave a mechanistic answer. Their successors, until the Renaissance, were more interested in the teleological question, and thus led science up a blind alley.
 

Is a feed-forward network (or any inverse problem) going to change this in a qualitative way? The mind goes again to Norbert Wiener and the 1943 article “Behavior, Purpose and Teleology”. McCulloch and Pitts, with their logical calculus of the nervous system, were around the corner.

Market completion with Wishart variance

The following page documents the simulation of the model described here, i.e. Wishart-based stochastic volatility. The parameters for the Euler-Maruyama simulation are:

S_0 = 1
T = 1
\Delta t = 0.01
d = 2
\alpha = 4.8
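
For reference, the dynamics being discretized below – spelled out in the comments of the code itself – are the Wishart variance process

dS_t = \sqrt{S_t}\, dB_t\, Q + Q^{\top} dB_t^{\top} \sqrt{S_t} + \left( S_t M + M^{\top} S_t + \alpha\, Q^{\top} Q \right) dt, \qquad S_0 = s_0,

and, for the log-asset X_t,

dX_t = -\tfrac{1}{2}\, \mathrm{Tr}(S_t)\, dt + \mathrm{Tr}\!\left( \sqrt{S_t}\, dZ_t \right), \qquad Z_t := W_t R^{\top} + B_t \sqrt{I_d - R R^{\top}}.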

[Figure: wishart_paths_20]

[Figure: wishart_paths_100]

Example of an arbitrage-free volatility surface (the dynamics of the vol surface) generated by the model (here the x axis is the tenor structure, the y axis the strike level and the z axis the call option price).

The simulation of the Wishart variance process is accomplished via the following code:

function [S,eigS,timestep,minh]=Wishart(T,h,d,alpha,Q,M,s_0)
%
% Simulates the d-dimensional Wishart process on the interval [0,T]
% that follows the stochastic differential equation
% dS_t=sqrt{S_t}*dB_t*Q+Q'*dB_t'*sqrt{S_t}+(S_t*M+M'*S_t+alpha*Q'*Q)dt
% with initial condition S_0=s_0.
%
% Method of discretization: Euler-Maruyama
% Starting step size: h
% In order to guarantee positive semidefiniteness of S, the step size will be
% reduced iteratively if necessary.
%
% Output:
% S is a three dimensional array of the discretized Wishart process.
% eigS is a matrix consisting the eigenvalues of S
% timestep is the vector of all timesteps in [0,T]
% minh is the smallest step size used, i.e. minh=min(diff(timestep))
%--------------------------------------------------------------------
horg=h;
minh=h;
timestep=0;
[V_0,D_0]=eig(s_0);
eigS=sort(diag(D_0)');
% to ensure evolution in the right space
drift_fix=alpha*Q'*Q;


B_old=0;
B_inc=sqrt(h)*normrnd(0,1,d,d);
% not clear why he needs Q : the correlation structure?
vola=V_0*sqrt(D_0)*V_0'*B_inc*Q;
drift=s_0*M;
S_new = s_0+vola+vola'+(drift+drift'+drift_fix)*h;
[V_new,D_new]=eig(S_new);
eigS=[eigS;sort(diag(D_new)')];

% concatenate along third dimension
S=cat(3,s_0,S_new);
t=h;
timestep=[timestep;t];
flag=0;
while t+h<T,
	B_old=B_old+B_inc;
	B_inc=sqrt(h)*normrnd(0,1,d,d);
	S_t=S_new; V_t=V_new; D_t=D_new;
	sqrtm_S_t=V_t*sqrt(D_t)*V_t';
	vola=sqrtm_S_t*B_inc*Q;
  %vola= V_t*sqrt(D_t)*V_t'*B_inc*Q;
	drift=S_t*M;
	S_new = S_t+vola+vola'+(drift+drift'+drift_fix)*h;
	[V_new,D_new]=eig(S_new);
	mineig=min(diag(D_new));
  
  % check whether the minimum eigenvalue is negative
	while mineig<0,
		h=h/2;
		minh=min(minh,h);
		flag=1;
		if h<eps, error('Step size converges to zero'), return, end
		B_inc=0.5*B_inc+sqrt(h/2)*normrnd(0,1,d,d);
		vola=sqrtm_S_t*B_inc*Q;
		drift=S_t*M;
		S_new = S_t+vola+vola'+(drift+drift'+drift_fix)*h;
		mineig=min(eig(S_new));
	end
	if flag==0,
		eigS=[eigS;sort(diag(D_new)')];
		S=cat(3,S,S_new);
		t=t+h;
		timestep=[timestep;t];
	end
	if flag==1,
		[V_new,D_new]=eig(S_new);
		eigS=[eigS;sort(diag(D_new)')];
		S=cat(3,S,S_new);
		t=t+h;
		timestep=[timestep;t];
		flag=0;
		h=horg;
	end
end
h_end=T-t;
if h_end>0,
	B_inc=sqrt(h_end)*normrnd(0,1,d,d);
	S_t=S_new; V_t=V_new; D_t=D_new;
	vola=V_t*sqrt(D_t)*V_t'*B_inc*Q;
	drift=S_t*M;
	S_new = S_t+vola+vola'+(drift+drift'+drift_fix)*h;
	S=cat(3,S,S_new);
	eigS=[eigS;sort(eig(S_new)')];
	timestep=[timestep;T];
end
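
A minimal usage sketch – my own driver, not part of the original code; the parameter values mirror those hard-coded in EulerSinglePath below:

T = 1; h = 0.01; d = 2; alpha = 4.8;
Q   = [0.35, 0.; 0., 0.4];
M   = [-0.6, 0.; 0., -0.4];
s_0 = [0.5,  0.; 0., 0.5];
[S, eigS, timestep, minh] = Wishart(T, h, d, alpha, Q, M, s_0);
% the eigenvalue paths should stay non-negative if the step refinement works
plot(timestep, eigS);
xlabel('t'); ylabel('eigenvalues of S_t');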

When that is done, the following simulates a single Euler-Maruyama realization of a path:

function [X]=EulerSinglePath(T,h,S_0)
% simulation of Wishart based process
% parameters are drawn from Gauthier and Possamai paper 
% Efficient Simulation 

% dimension of the Wishart variance process
d = 2;
alpha =4.8;
% initial log spot
X_0 = log(S_0);

% Variance initial level
s_0 = [0.5,  0.; 0., 0.5];
% drift correction
Q   = [0.35, 0.; 0., 0.4];
% drift component
M   = [-0.6, 0.; 0., -0.4];
% correlation
R   = [-0.5, 0.; 0.,-0.5];

% Simulation of wishart variance
[S,eigS,timestep,minh] = Wishart(T,h,d,alpha,Q,M,s_0);

 % after simulation of matrix valued process, simulation of asset is given here

% precomputing the correlation structure
IminRR =  eye(d,d)-R*R';
[V_ImRR,D_ImRR]=eig(IminRR);
sqImR=V_ImRR*sqrt(D_ImRR)*V_ImRR';


% cycling over the third (time) dimension of the array
n_steps = size(S, 3);
X = zeros(1, n_steps);
% setting the initial log asset level
X(1) = X_0;

for n = 2:n_steps
  % time increment actually used by the Wishart simulation (steps may have been refined)
  dt = timestep(n) - timestep(n-1);

  % drift term -0.5*Tr(S_t)*dt, evaluated at the left endpoint (Euler-Maruyama)
  TrS_t = trace(S(:,:,n-1));
  drift = -0.5*TrS_t*dt;

  % VOLATILITY
  [V_t,D_t] = eig(S(:,:,n-1));
  % the formula below corresponds to
  % Z_t := W_t R^T + B_t sqrt(I_d - R R^T)
  % NOTE: ideally W_t would reuse the Brownian increments that drove the Wishart
  % process; here they are redrawn, so the correlation R is only approximate
  W_t = sqrt(dt)*normrnd(0,1,d,d);
  B_t = sqrt(dt)*normrnd(0,1,d,d);
  Z = W_t * R' + B_t * sqImR;

  vola = V_t*sqrt(D_t)*V_t'*Z;

  X(n) = X(n-1) + drift + trace(vola);
end
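
A sketch of a driver for a single path (again hypothetical, not in the original post):

T = 1; h = 0.01; S_0 = 1;
X = EulerSinglePath(T, h, S_0);
plot(exp(X));   % back from log-spot to the spot level
xlabel('step'); ylabel('asset level');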

 


NOTES

Rogers, L. C. G., Tehranchi, M., “Can the implied volatility surface move by parallel shifts?”

Octave meshgrid, surf and video slideshow

Since the release of the splendid GUI, Octave has once again become one of my favorite tools. Here is a simple and hopefully useful application. The following code is quite self-explanatory:

tx = ty = linspace (-4, 4, 41)';        % grid nodes on [-4, 4]
[xx, yy] = meshgrid (tx, ty);
for ii = 1:23
  alpha = 0.01*ii;
  tz = exp(-alpha*(xx .^2 + yy .^2));   % Gaussian bump, narrowing as alpha grows
  surf(tx, ty, tz);
  tit = strcat ("alpha=", num2str(alpha));
  title(tit);
  fname = sprintf("anim_%02i.jpg", ii); % one frame of the slideshow
  print (fname)                         % image format inferred from the extension
  ans = yes_or_no ("prompt")            % pause between frames until the user answers
endfor

 

After that, having installed ffmpeg, just issue from either a bash or cmd shell:

ffmpeg.exe  -framerate 2 -start_number 1 -i "anim_%02d.jpg" -vcodec mpeg4 evolution.mp4

 

to obtain the resulting video, evolution.mp4.

IPython: simple finite differences

IPython as a prototyping tool for scientific computation is very neat and useful indeed.
After the minimal Windows 8 installation reviewed in a previous post, we are now going for a simple Crank-Nicolson scheme for a parabolic PDE. We keep the diffusions non-degenerate, as we don’t want the problem to become hyperbolic and to lose the parabolic smoothing.
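As a reminder – the standard textbook form, quoted here for orientation rather than lifted from the notebook – Crank-Nicolson for the heat-type equation u_t = a u_{xx} averages the explicit and implicit second-difference stencils:

\frac{u_j^{n+1} - u_j^{n}}{\Delta t} = \frac{a}{2\,\Delta x^{2}} \left[ \left( u_{j+1}^{n+1} - 2u_j^{n+1} + u_{j-1}^{n+1} \right) + \left( u_{j+1}^{n} - 2u_j^{n} + u_{j-1}^{n} \right) \right]

The scheme is unconditionally stable and second-order accurate in both \Delta t and \Delta x, which is what makes it the default choice for parabolic problems.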

To run the notebook, one simply needs two more libraries, numpy and matplotlib.
They can be obtained as precompiled “wheel” files, courtesy of Christoph Gohlke,
from here.

After the relevant wheel files have been downloaded, simply run cmd (as administrator) and issue, from the directory where they have been saved:

pip install "numpy_my_version.whl"
pip install "matplotlib_my_version.whl"

(where of course the suffix “my_version” has to be changed to whatever the actual version is).
This zip contains a fixed version of a buggy notebook found on the internet. It now works fully under Python 3.
After uploading it in the usual way, run it. The finite-difference stencil is easily computed, and the heatmap is nicely displayed. LaTeX is there as well to document the mathematical steps.
Literate programming at its best.

IPython in Windows 8: “Hello World”

This post documents the installation of IPython on Windows 8.
As an example of Don Knuth’s literate programming, IPython is simply great.
One can write down the mathematical equations of a model, code the numerics and run the program against data. The full lifecycle of science, in a single sheet.

1) Install the Python runtime. Version 3.x is recommended, from this link.

2) Suppose the runtime has been installed in C:\Python3.4. Add this directory, as well as C:\Python3.4\Scripts, to the system path.

3) Save this file in the same directory C:\Python3.4; then, running cmd as administrator, issue from the shell:

python ez_setup.py

4) A C compiler is needed to compile extensions. Among the various choices, the simplest is the Visual Studio 10.0 / Windows SDK 7.1 toolchain configured in the next step.

5) Create a vcvars64.bat file in C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin\amd64 that contains:

CALL "C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\SetEnv.cmd" /x64

6) Issue as admin:

easy_install ipython[all]

then

pip install markupsafe

Finally, run it:

 
ipython notebook

After that, the page

"http://localhost:8888/tree"

will open in the browser. Under the heading Files->Upload, load the minimal IPython notebook contained in this zip (unzip it first). The first IPython program is up: it can be executed and modified interactively.