• The Computing and Information Science and Engineering Landscape: A Look Forward

The United States National Science Foundation (NSF) supports a majority of US academic research in the Computer and Information Science and Engineering (CISE) topic areas. A long-time computing researcher herself, Dr. Margaret Martonosi is now serving a 4-year term leading the NSF CISE Directorate, stewarding its $1B+ annual budget on behalf of research, education, workforce, and infrastructure funding in CISE topic areas and for science as a whole. In this talk, she will discuss key themes for the field, how CISE is developing programmatic opportunities to advance research related to them, and how CISE invests in cross-cutting people issues for the field. Martonosi is conducting a series of virtual and in-person campus visits to engage in conversation about a vision for CISE research going forward and to field Q&A from the CISE community. Please join us for this highly interactive session, and please bring your input and questions!

  • Provably Beneficial Artificial Intelligence

    As AI advances in capabilities and moves into the real world, its potential to benefit humanity seems limitless. Yet we see serious problems including racial and gender bias, manipulation by social media, and an arms race in lethal autonomous weapons.  Looking further ahead, Alan Turing predicted the eventual loss of human control over machines that exceed human capabilities. I will argue that Turing was right to express concern but wrong to think that doom is inevitable. Instead, we need to develop a new kind of AI that is provably beneficial to humans.

  • Not a Science Project!

Many of us academics are now crossing what once appeared to be the “valley of death” into industry. Industry wants state-of-the-art technology to differentiate itself in the market, and researchers want to contribute something meaningful to industry to broaden their horizons (as well as generate extra income). As the boundaries between academia and industry continue to blur, we are at a pivotal moment in history where understanding where and how to deploy cutting-edge technology is becoming potentially even more crucial than the technological advancement itself. This way of thinking was neglected until recently, but the cure, a truly human- (or user-) centric approach to product creation, is what must be adopted as we move forward. From creating predictive and preventive healthcare at home, to tech-enabled human services, to running cutting-edge technologies in the background of consumer devices, I will reflect on my own experiences of what it takes to bring meaningful innovation to the consumer market, and on the endless possibilities the future holds if we do this right.

  • Brain-Computer Interfaces: Past, Present, and Future

Our brain uses electrical signals for communication, and by sensing and modulating this electrical activity, we can create direct connections between the human brain and the external world. The potential for brain-computer interfaces (BCIs) is vast and varied, with therapeutic and future applications that have yet to be realized. From its early beginnings to the latest breakthroughs, this talk will provide a comprehensive overview of the exciting possibilities of BCI technology. I will discuss the current state of BCI technology, highlighting recent progress in both academia and industry, as well as the remaining challenges that must be overcome to fully realize its potential.

  • Automating Chip Design with Artificial Intelligence – a new golden era of creating chips?

Artificial Intelligence is an avenue to innovation that is touching every industry worldwide. AI has made rapid advances in areas like speech and image recognition, gaming, and even self-driving cars, essentially automating less complex human tasks. In turn, demand for AI drives rapid growth across the semiconductor industry, with new chip architectures emerging to deliver the specialized processing needed for the huge breadth of AI applications. Given the advances made in automating simple human tasks, can AI solve more complex tasks, such as designing a computer chip? In this keynote, we will discuss the challenges and opportunities of building advanced chip designs with the help of artificial intelligence, enabling higher performance and faster time to market, and reusing machine-generated learning across successive products.

  • A vision for a semiconductor quantum processor – hot, dense and coherent

Quantum computation has captivated the minds of many for almost two decades. For much of that time, it was seen mostly as an extremely interesting scientific problem. In the last few years, we have entered a new phase as the belief has grown that a large-scale quantum computer can actually be built. Quantum bits encoded in the spin states of individual electrons in silicon quantum dot arrays have emerged as a highly promising direction. In this talk, I will present our vision of a large-scale spin-based quantum processor with integrated on-chip classical electronics, and ongoing work to realize this vision. We have achieved two-qubit operations with errors below 0.4% and universal control of up to six qubits. In close collaboration with our engineering colleagues and Intel, we have implemented qubit control using a cryogenic control chip, which will help overcome the wiring bottleneck. We have also performed preliminary experiments with switched-capacitor circuits integrated on the qubit chip. Finally, given that the electronic circuits will dissipate heat, we have tested the impact of higher operating temperatures, up to 1 Kelvin, on qubit performance. Combined, the progress along these fronts can lead the way to scalable systems of high-fidelity spin qubit registers for computation and simulation.