Keynote Speakers
- Josh Bongard
- Jean-Philippe Bouchaud
- Rodney Douglas
- Artur Ekert
- Neelie Kroes
- John Pendry
- Gábor Prószéky
- Claire Tomlin
Rodney Douglas
Computational Neuroscience, Neuroinformatics, Neuromorphic Systems
Title: "Constructive Cortical Computation"
Abstract: During the past century ever more sophisticated methods have been developed for constructing and programming computing and manufacturing machines. However, these methods are essentially forward processes that depend on intelligent human designers and programmers. They stand in stark contrast to Biology's methods of self-construction used to evoke the flexible information processor that is the mammalian neocortex. Understanding this radically different approach that uses algorithmic self-programming and construction could have enormous consequences for future computing and manufacturing technologies. In this talk we describe progress towards understanding these principles through detailed simulation of the development of the neocortex.
Institute of Neuroinformatics
UZH and ETH Zurich
Neelie Kroes
Vice President of the European Commission
Neelie Kroes is no stranger to ICT research and innovation, having been involved in funding such work since the early 1980s.
Biography: Since February 2010, Kroes has been a Vice President of the European Commission, leading implementation of the Digital Agenda for Europe - the EU's comprehensive action plan to harness ICTs to drive growth and address social challenges. Kroes will be mobilising industry, national governments, other stakeholders and her colleagues to deliver on 31 pieces of legislation and 101 targets by the end of her term in 2015. At the top of the list is the EU's commitment to deliver 'broadband for all'.
Known for her pragmatic and straight-talking approach, Kroes has a formidable delivery record. She has also won fans as an ongoing champion of open data and open standards.
Prior to her roles in Brussels, Kroes was President of Nyenrode University from 1991-2000, and served on the boards of a string of major companies such as Lucent Technologies, Volvo, and P&O Nedlloyd.
Kroes hails from the liberal VVD Party in the Netherlands, and served as a national minister of transport and telecommunications in the 1970s and 1980s. Kroes has been a Knight of the Order of the Netherlands Lion since 1981 and a Grand Officer of the Order of Orange-Nassau since 1989.
Artur Ekert
Quantum information technology
Title: "Is the age of computation yet to begin?"
Abstract: The theory of classical universal computation was laid down in 1936, was implemented within a decade, became commercial within another decade, and dominated the world's economy half a century later. This success story relied on progress in technology. As computers become faster, they must become smaller. The history of computer technology has involved a sequence of changes from one type of physical realisation to another - from gears to relays to valves to transistors to integrated circuits and so on. The unavoidable step to the quantum level will be one in this sequence, but it promises something more exciting as well. For the first time since the invention of the general-purpose computer, a change in underlying hardware can give computers qualitatively new functionality. Quantum theory is already important in the design of microelectronic components. Soon it will be necessary to harness quantum theory, rather than simply take it into account. I will describe our quest to understand quantum theory, our efforts to develop quantum technology to support quantum computation, and our surprise and excitement when we discovered that nature already employs coherent quantum phenomena in biological systems. There is so much potential in this fundamentally new way of harnessing nature that it appears as though the age of computation has not yet even begun!
Biography: Artur Ekert studied physics at the Jagiellonian University in Kraków and at the University of Oxford.
Between 1987 and 1991 he was a D.Phil. student at Wolfson College, University of Oxford. In his doctoral thesis (Oxford, 1991) he showed how quantum entanglement and non-locality can be used to distribute cryptographic keys with perfect security.
In 1991 he was elected a Junior Research Fellow and subsequently (1994) a Research Fellow at Merton College, Oxford. At the time he established the first research group in quantum cryptography and computation, based in the Clarendon Laboratory, Oxford. Subsequently it evolved into the Centre for Quantum Computation, now based at DAMTP in Cambridge.
Between 1993 and 2000 he held the position of Royal Society Howe Fellow.
In 1998 he was appointed a Professor of Physics at the University of Oxford and a Fellow and Tutor in Physics at Keble College, Oxford.
From 2002 until early 2007 he was the Leigh-Trapnell Professor of Quantum Physics at the Department of Applied Mathematics and Theoretical Physics, Cambridge University and a Professorial Fellow of King's College, Cambridge.
Since 2007 he has been a Professor of Quantum Physics at the Mathematical Institute, Oxford University, and a Lee Kong Chian Centennial Professor at the National University of Singapore. For his discovery of quantum cryptography he was awarded the 1995 Maxwell Medal and Prize by the Institute of Physics and the 2007 Hughes Medal by the Royal Society. He is also a co-recipient of the 2004 European Union Descartes Prize. He has worked with and advised several companies and government agencies.
Artur Ekert's research extends over most aspects of information processing in quantum-mechanical systems, with a focus on quantum cryptography and quantum computation. Building on the ideas of quantum non-locality and Bell's inequalities, he introduced entanglement-based quantum key distribution in his 1991 paper, which generated a spate of new research, established a vigorously active new area of physics and cryptology, and is still the most cited paper in the field. His subsequent work with John Rarity and Paul Tapster, from the Defence Research Agency (DRA) in Malvern, resulted in the proof-of-principle experimental quantum key distribution, introducing parametric down-conversion, phase encoding and quantum interferometry into the repertoire of cryptography. He was the first to develop the concept of a security proof based on entanglement purification.
Ekert has made a number of pioneering contributions to both theoretical aspects of quantum computation and proposals for its experimental realisation. These include proving that almost any quantum logic gate operating on two quantum bits is universal, proposing one of the first realistic implementations of quantum computation (e.g. using the induced dipole-dipole coupling in an optically driven array of quantum dots), introducing more stable geometric quantum logic gates, and proposing "noiseless encoding", which later became known as decoherence-free subspaces.
His other notable contributions include his work on quantum state swapping, optimal quantum state estimation and quantum state transfer. He is also known for his work on connections between the notion of mathematical proofs and the laws of physics, and for his semi-popular writing on the history of science.
Claire Tomlin
Software technologies, computer technology, computational complexity, agent-based systems
Title:"Mathematical models to help understand developmental biology and cancer"
Abstract: As the understanding of cellular regulatory networks grows, system dynamics and behaviors resulting from feedback effects of such systems have proven to be sufficiently complex so as to prevent intuitive understanding. Mathematical modeling in engineering and in physics or chemistry has traditionally sought to extrapolate from existing information and underlying principles to create complex descriptions of various systems, which could be analyzed or simulated, and from which further abstractions could be made. However, in studying biological systems, often only incomplete abstracted hypotheses exist to explain observed complex patterning and functions.
The challenge has become to show that enough of a network is understood to explain the behavior of the system. Mathematical modeling must simultaneously characterize the complex and nonintuitive behavior of a network, while revealing deficiencies in the model and suggesting new experimental directions. In this talk, we describe the process of modeling two biological networks: planar cell polarity in development, and treated regulatory networks in breast cancer. We demonstrate the use of the mathematical models both in understanding system behavior and in suggesting new treatments.
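As a purely illustrative sketch (not one of the models discussed in the talk), even a two-component feedback loop of the kind the abstract alludes to can behave in ways that are hard to anticipate by intuition alone; every parameter value below is an arbitrary assumption:

# Toy two-component regulatory feedback loop, integrated with forward Euler.
# Illustrative only: protein A is repressed by B, protein B is activated by A.

def simulate(a=1.0, b=0.0, k_prod=1.0, k_deg=0.5, n=4, dt=0.01, steps=5000):
    trajectory = []
    for _ in range(steps):
        da = k_prod / (1.0 + b ** n) - k_deg * a            # production of A repressed by B
        db = k_prod * a ** n / (1.0 + a ** n) - k_deg * b   # production of B activated by A
        a, b = a + dt * da, b + dt * db
        trajectory.append((a, b))
    return trajectory

a_final, b_final = simulate()[-1]
print(f"approximate steady state: A={a_final:.3f}, B={b_final:.3f}")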
Biography: In September 1998, Claire Tomlin received her Ph.D. from the Department of Electrical Engineering and Computer Sciences at Berkeley. She received her B.A.Sc. in Electrical Engineering from the University of Waterloo, Canada, in 1992, and her M.Sc. in Electrical Engineering from Imperial College, London, England, in 1993.
Claire Tomlin's research covers systems science – stochastic hybrid dynamics and control – with applications to systems biology (she recently spent a six-month sabbatical at the Karolinska Institute working with cancer researchers), avionics and flight control. She is a MacArthur Fellow (the "genius award" for young researchers in the US).
Gábor Prószéky
Human language technologies, natural language processing, parallel applications, neuro-linguistics
Title: "The (hopefully near) future of human language technologies"
Abstract: Today’s language technology applications usually rely either on human-designed rules (applied sequentially by computers) or on large amounts of sequential data, that is, spoken or textual corpora. Computer modeling of human language abilities does not yet use parallel methods: in current natural language processing paradigms the notion of parallelism is almost totally missing, even though multi-core processors are nowadays available in ordinary commercial computers. At the same time, results from brain research remain quite far from existing language technology applications. Applying parallelism would lead us to a more realistic architecture for language understanding, with increased processing speed.
Biography: Gábor Prószéky is CEO of MorphoLogic, the leading Hungarian language technology company, and professor and vice-dean of the new Faculty of Information Technology at the Pázmány Péter Catholic University, Budapest. He graduated from ELTE University both in software engineering and in general & applied linguistics. He holds a PhD (1994) in computational linguistics, and in 2005 he received the title Doctor of the Hungarian Academy of Sciences.
Since his university years, he has been involved in more than thirty R&D projects in human language technologies (HLT) and in computational and theoretical aspects of the humanities. In addition to more than 130 scientific publications, mainly on HLT, he is the author of three comprehensive books on human language technologies. Among other roles, he is president of the Lexicographical Committee of the Hungarian Academy of Sciences, a board member of the European Language Resources Association, and a member of various international and national associations and committees.
In 1991, together with software engineer colleagues working on human language technology applications, he founded MorphoLogic, the first language-industry company in Hungary. Since then, MorphoLogic’s various applications have been licensed by Microsoft, IBM and Xerox, among others.
Gábor Prószéky has led MorphoLogic’s research and development in numerous language technology projects supported by the European Commission. In 1999, MorphoLogic won the IST Prize of the European Commission, and in 2000 Gábor Prószéky received Hungary’s highest award, the Széchenyi Prize, for his activities. He has also received the Kalmár Award of the John von Neumann Computer Society (1995), the IT Manager of the Year award (2002), the Brassai Award (2005), the Award for the Hungarian IT (2005) and the Dennis Gabor Award (2010).
Josh Bongard
Robotics, bionics, bio-inspired processes, self-repair
Title: "How Evolution Shapes the Way Roboticists Think"
Abstract: Roboticists, by necessity, are keen students of biology: we hope to create machines that are as agile, adaptive and intelligent as the organisms we see around us. However, roboticists tend to copy the end products of evolution (compliant limbs, neural circuits, legged gaits) rather than evolutionary processes themselves (selection pressures, developmental programs). In this talk I will show how re-creating evolution in a computer can allow us to design robots automatically, rather than trying to build them manually.
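A minimal sketch of the kind of evolutionary loop the abstract refers to, in Python; the genome length, mutation scheme and stand-in fitness function are illustrative assumptions, not Bongard's actual setup (in a robotics setting the fitness call would run a physics simulation of the robot):

import random

GENOME_LENGTH = 8        # number of controller parameters (assumed)
POPULATION_SIZE = 20
GENERATIONS = 50
MUTATION_STD = 0.1

def fitness(genome):
    # Placeholder for "simulate the robot and score its behaviour":
    # here we simply reward genomes close to an arbitrary target vector.
    target = [0.5] * GENOME_LENGTH
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def mutate(genome):
    return [g + random.gauss(0.0, MUTATION_STD) for g in genome]

population = [[random.uniform(-1.0, 1.0) for _ in range(GENOME_LENGTH)]
              for _ in range(POPULATION_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)          # rank by fitness
    survivors = population[:POPULATION_SIZE // 2]       # keep the best half
    children = [mutate(random.choice(survivors))        # refill with mutated copies
                for _ in range(POPULATION_SIZE - len(survivors))]
    population = survivors + children

population.sort(key=fitness, reverse=True)
print("best fitness found:", fitness(population[0]))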
Biography: Josh Bongard is an assistant professor at the University of Vermont. He is an NSF CAREER award recipient, a member of MIT Technology Review's 'Top 35 Innovators under the Age of 35', and a Microsoft Faculty Fellow. He received his Bachelor's degree in Computer Science from McMaster University, Canada; his Master's degree in Evolutionary and Adaptive Systems from the University of Sussex, UK; and his PhD from the University of Zurich, Switzerland. He also served as a postdoctoral associate in the Computational Synthesis Laboratory at Cornell University.
Morphology, Evolution and Cognition Laboratory
Department of Computer Science
University of Vermont
Vermont Advanced Computing Center
Jean-Philippe Bouchaud
Finance, economics, socio-technical systems
Title: "The endogenous dynamics of markets: price impact and feedback loops"
Abstract: We review the evidence that the erratic dynamics of markets is to a large extent of endogenous origin, i.e. determined by the trading activity itself and not due to the rational processing of exogenous news. In order to understand why and how prices move, the joint fluctuations of order flow and liquidity – and the way these impact prices – become the key ingredients. Impact is necessary for private information to be reflected in prices, but by the same token, random fluctuations in order flow necessarily contribute to the volatility of markets. Our thesis is that the latter contribution is in fact dominant, resulting in a decoupling between prices and fundamental values, at least on short to medium time scales. We argue that markets operate in a regime of vanishing revealed liquidity, but large latent liquidity, which would explain their hyper-sensitivity to fluctuations. More precisely, we identify a dangerous feedback loop between bid-ask spread and volatility that may lead to microliquidity crises and price jumps. We discuss several other unstable feedback loops that should be relevant to account for market crises: imitation, unwarranted quantitative models, pro-cyclical regulation, etc.
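A deliberately simplified illustration of the spread-volatility feedback loop described above; the coupling coefficients are invented for the example and carry no empirical meaning:

import random

# Toy feedback loop, purely illustrative: random order flow moves the price by
# an amount proportional to the spread, volatility tracks recent impact, and
# the spread widens with volatility. With COUPLING below 2 the loop settles to
# a small stable spread; pushing COUPLING above 2 makes spread and volatility
# run away, a cartoon of the liquidity crises mentioned in the abstract.

COUPLING = 1.0        # how strongly the spread responds to volatility (invented)
price, spread, volatility = 100.0, 0.01, 0.01

for _ in range(2000):
    order_sign = random.choice([-1, 1])                 # buy or sell order
    impact = order_sign * 0.5 * spread                  # impact grows with the spread
    price += impact + random.gauss(0.0, volatility)     # price update
    volatility = 0.9 * volatility + 0.1 * abs(impact)   # volatility tracks impact
    spread = 0.005 + COUPLING * volatility              # spread widens with volatility

print(f"price={price:.2f}  spread={spread:.4f}  volatility={volatility:.4f}")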
Biography: Jean-Philippe Bouchaud was born in France in 1962. After studying at the French Lycée of London, he graduated from the Ecole Normale Supérieure in Paris, where he also obtained his PhD in physics. He then worked for the CNRS until 1992. After a year spent in the Cavendish Laboratory (Cambridge), he joined the Service de Physique de l’Etat Condensé (CEA-Saclay), where he worked on the dynamics of glassy systems and on granular media.
He became interested in economics and theoretical finance in 1991. His work in finance includes extreme risk models, agent-based simulations, market microstructure and price formation. He has been very critical of the standard concepts and models used in economics and in the financial industry (market efficiency, Black-Scholes models, etc.).
He founded the company Science & Finance in 1994, which merged with Capital Fund Management (CFM) in 2000. He is now President and Head of Research at CFM, and has been a professor at Ecole Polytechnique since 2008. He was awarded the IBM young scientist prize in 1990 and the C.N.R.S. Silver Medal in 1996. He has published over 250 scientific papers and several books in physics and in finance.
John Pendry
Optics and metamaterials
Title: "The Science of Invisibility"
Abstract: Refractive materials give only limited control of light: we can fashion lenses and construct waveguides, but complete control is beyond simple refracting materials. Ideally we might wish to channel and direct light as we please, just as we might divert the flow of a fluid. Manipulation of Maxwell’s equations shows that we can achieve just that, and metamaterials open the door to this new design paradigm for optics, providing the properties required to give complete control of light. One potential application would be to steer light around a hidden region, creating a cloak of invisibility.
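The design recipe behind this "transformation optics" approach can be stated compactly. If a coordinate transformation x → x′ with Jacobian Λ describes how the light is to be steered (for a cloak, a transformation that opens up a hole in space), then filling ordinary space with a metamaterial whose permittivity and permeability are rescaled as follows reproduces that steering exactly; the notation here is generic rather than taken from the talk:

\[
\varepsilon' = \frac{\Lambda\,\varepsilon\,\Lambda^{\mathsf{T}}}{\det\Lambda},
\qquad
\mu' = \frac{\Lambda\,\mu\,\Lambda^{\mathsf{T}}}{\det\Lambda},
\qquad
\Lambda_{ij} = \frac{\partial x'_i}{\partial x_j}.
\]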
Biography: John Pendry is a condensed matter theorist. He has worked at the Blackett Laboratory, Imperial College London, since 1981. He began his career in the Cavendish Laboratory, Cambridge, followed by six years at the Daresbury Laboratory, where he headed the theoretical group. He has worked extensively on the electronic and structural properties of surfaces, developing the theory of low-energy electron diffraction and of electronic surface states. Another interest is transport in disordered systems, where he produced a complete theory of the statistics of transport in one-dimensional systems.
In 1992 he turned his attention to photonic materials and developed some of the first computer codes capable of handling these novel materials. This interest led to his present research, the subject of his lecture, which concerns the remarkable electromagnetic properties of materials in which the normal response to electromagnetic fields is reversed, leading to negative values of the refractive index. This innocent description hides a wealth of fascinating complications. In collaboration with scientists at The Marconi Company he designed a series of ‘metamaterials’ whose properties owed more to their micro-structure than to the constituent materials. These made accessible completely novel materials with properties not found in nature. Metamaterials with negative electrical permittivity, and then with negative magnetic permeability, were successively designed and constructed.
These designs were subsequently the basis for the first material with a negative refractive index, a property predicted 40 years ago by a Russian scientist but unrealised because of the absence of suitable materials. He went on to explore the surface excitations of the new negative materials and showed that these belong to the family of surface plasmon excitations familiar in metals. This project culminated in the proposal for a ‘perfect lens’ whose resolution is unlimited by wavelength. These concepts have stimulated further theoretical investigations and many experiments which have confirmed the predicted properties. The simplicity of the new concepts, together with their radical consequences, has caught the imagination of the world’s media, generating much positive publicity for science in general.