

Future Current: ECE Research on the Cutting-Edge of Circuit Design

Wednesday, January 13, 2016

A circuit layout developed by Assistant Professor Christoph Studer implements a dual-core large-scale multiple-input multiple-output (MIMO) detector. The layout characterizes the area, energy and timing of the new circuit architecture.

The never-ending pursuit of more powerful and efficient microchips grows more challenging as electrical engineers run up against the limits of key design features. At Cornell Engineering’s School of Electrical and Computer Engineering, faculty are looking at electronic circuits in a different way, not only to improve performance, but to create new devices that make life healthier, more productive and more enjoyable.

Analog Circuits

Electronic circuits have grown so powerful, so small and, in many cases, so affordable that they are creating what is known as the Internet of Things—a network of “smart” devices that can communicate and exchange data with each other. Everything from mobile phones and televisions to light bulbs and coffee makers can be programmed wirelessly and can collect data about usage and user preferences. For humans themselves to become part of this network, circuits must be able to sense analog information such as a heartbeat, a step or a breath, and convert it into meaningful digital data.
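As a rough illustration of that last step (the heartbeat-like waveform, sampling rate and bit depth below are arbitrary assumptions, not drawn from any Cornell design), converting an analog signal into digital data amounts to sampling it at regular intervals and quantizing each sample to a fixed number of bits:

```python
import numpy as np

# Hypothetical analog waveform: a 1.2 Hz "heartbeat-like" sine wave (illustrative only).
fs = 250                        # sampling rate in samples per second
t = np.arange(0, 2.0, 1 / fs)   # two seconds of sample times
analog = np.sin(2 * np.pi * 1.2 * t)

# Quantize each sample to 8 bits (256 levels) spanning the -1 to +1 range.
levels = 256
digital = np.round((analog + 1) / 2 * (levels - 1)).astype(np.uint8)

print(digital[:10])   # the stream of integers a smart device would store or transmit
```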

While devices like smart watches have succeeded in wirelessly tracking our movements and even basic vital signs such as heart rate, Professor Alyssa Apsel is building circuits that can track something much more difficult: neurochemicals. The ability to monitor neurochemicals such as dopamine can give researchers new insights into social behaviors and new tools for managing neurodegenerative diseases such as Parkinson’s.

A common neurochemical sensing technique in rats—fast-scan cyclic voltammetry—allows for real-time collection of neurodata, but requires biosensors that are tethered to a computer system that can translate and digitize the data. This restricts the animal’s movement and dictates when and where data can be acquired. The solution being developed by Apsel and Carlos Dorta-Quiñones Ph.D. ’14 is a low-cost, low-power chip that can be implanted in a rat’s brain and allows it to move freely while transmitting data wirelessly.

The challenge lies in the performance of the chip, which must be able to read dopamine levels while still having enough power to communicate the data to a remote system. To extend the range over which the circuit can transmit data, Apsel and Dorta-Quiñones developed a new compressive analog-to-digital converter that ignores unwanted data and focuses on the small amounts of dopamine being produced in the rat’s brain. “With this we could decrease the amount of data being sent wirelessly during neurochemical monitoring experiments. We paired this with our existing radio technology, which let us build a full system that was about 10 times more power-efficient than previously published work and did the same thing,” said Apsel.
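The converter itself is custom analog hardware, but the general idea behind compressive measurement can be sketched in software. The signal length, the handful of dopamine-like transients and the random measurement matrix below are illustrative assumptions, not details of Apsel and Dorta-Quiñones’s chip:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000                                    # samples in one recording window (illustrative)
signal = np.zeros(n)
signal[[120, 450, 700]] = [0.8, 1.0, 0.5]   # a few dopamine-like transients; mostly zeros

m = 100                                     # far fewer measurements than samples
phi = rng.standard_normal((m, n))           # random measurement matrix
measurements = phi @ signal                 # what would actually leave the chip

# Only m = 100 values are transmitted instead of n = 1000 in this toy example;
# the full signal is recovered off-chip later with a sparse-recovery solver.
print(measurements.shape)                   # (100,)
```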

The implications are easy to imagine. One day, doctors may be able to monitor a patient’s neurochemical balance noninvasively and in real time, without the patient ever leaving home. “I think circuits will continue to blend with our environment and actually become less obvious in our daily life. Our devices will become better and better at adapting to us and our needs,” said Apsel, on the future of circuit design in general.

Similarly, Associate Professor Ehsan Afshari is thinking about analog circuit design in an entirely new way in order to more effectively monitor human health. Afshari has developed a method of generating terahertz signals on a silicon microchip with the goal of one day replacing large and expensive machines designed to produce X-ray and terahertz signals. Medical imaging devices, such as body scans that can detect skin cancer, may one day be a hand-held technology for consumers, and could even be embedded in smartphones.

The challenge of producing a terahertz signal with a silicon metal-oxide-semiconductor transistor—the common transistor found in many microprocessors and data converters—is that the transistor’s maximum oscillation frequency lies below the terahertz band. In other words, the transistor can’t move charge fast enough to directly generate a terahertz signal that the circuit can tune.

To address this problem, Afshari has designed a circuit with multiple oscillators that harmonize and combine their power to create a higher-quality signal within a narrow band of the terahertz range. It’s a task that’s easier said than done, but using this method, Afshari and his team of researchers have managed to produce a 180-milliwatt signal at 0.32 terahertz and a 50-milliwatt signal at 0.34 terahertz, both “with the ability to change the frequency and steer the direction of the radiation,” said Afshari.
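A rough way to see how circuits built from sub-terahertz transistors can still produce terahertz output is harmonic generation: a tone pushed through a nonlinearity acquires energy at integer multiples of its fundamental frequency. The 80 GHz fundamental and the cubic nonlinearity below are illustrative choices, not the parameters of Afshari’s actual design:

```python
import numpy as np

f0 = 80e9                          # illustrative fundamental, 80 GHz (below the transistor limit)
fs = 2e12                          # simulation sample rate, 2 THz
t = np.arange(0, 2e-9, 1 / fs)     # 2 nanoseconds of signal

tone = np.cos(2 * np.pi * f0 * t)
distorted = tone ** 3              # crude stand-in for a device nonlinearity

spectrum = np.abs(np.fft.rfft(distorted))
freqs = np.fft.rfftfreq(len(distorted), 1 / fs)

mask = freqs > 2 * f0              # look above the fundamental
harmonic = freqs[mask][np.argmax(spectrum[mask])]
print(f"strongest component above 2*f0: {harmonic / 1e9:.0f} GHz")   # 240 GHz, the third harmonic
```

In a real chip, many such elements are kept in phase so their outputs combine rather than cancel.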

Using this research, the team has created the first fully integrated, coherent terahertz imaging system. “That means the transmitter and receiver are phase-locked. In simple terms, we not only detect the intensity of the signal, but we have its phase information. This results in a much higher sensitivity compared to the incoherent systems before us,” said Afshari. “So not only do we have great sources and individual components, we now have a complete coherent imaging system in our lab.”

To advance the technology, Afshari is using the same design to construct chips from materials that can handle higher frequencies, such as gallium nitride. He was recently able to generate a world-record 1-milliwatt signal at 0.22 terahertz—the most powerful signal ever from this type of integrated system. Such materials are significantly more expensive than silicon, so Afshari is also experimenting with adding more oscillators to his silicon-based circuits.

Aside from medical imaging, microchips that can produce the right terahertz frequencies could also be used for mobile security scanning and wireless transfer of large amounts of data. “With this kind of technology, we can easily envision adding terahertz imaging and communication technology to mobile phones. This means everyone can scan for certain medical issues or find certain materials,” said Afshari.

Digital Circuits

The invention of the transistor in 1947 set off a digital revolution that has seen integrated circuits grow exponentially more powerful. Transistors are among the most important elements of the circuits found in all electronic devices, controlling the flow of electricity and amplifying current. When engineers have wanted more performance, the common solution has been to leverage technology scaling by integrating faster transistors in greater numbers onto a single chip.

But as today’s transistors approach near-atomic sizes, that solution is beginning to reach its limits. Christopher Batten, assistant professor, says the performance-energy tradeoff reached a breaking point for monolithic general-purpose processors around 2005. As a result, engineers like Batten have been developing new and creative methods of providing more performance for less energy.

One of those methods is parallel computing. “The idea is instead of making a single-core processor more and more complicated, what we’ll do is go back to a simpler core and integrate two onto a chip. Theoretically, you would get twice the performance at the same energy,” said Batten. But there hasn’t been exponential growth in the number of processors on chips. Even high-end servers typically have only 16 processors or fewer because it’s difficult for them to work cooperatively. “You can have 100 workers building a house,” Batten analogizes, “but you would have guys sitting around waiting or they don’t know what to do.”

So instead, Batten says, think of just 16 workers building that same house, working together on one small section at a time: first the floors, then the walls, then the roof. The processors are performing the same task on different data, in parallel. Add to that specialized accelerators designed to speed up each task, and you have a very powerful chip.
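As a software-level analogy for that kind of data parallelism (the worker count and the workload below are arbitrary, and this is ordinary multicore code rather than Batten’s specialized hardware), the same task can be mapped over different chunks of data across many workers at once:

```python
from multiprocessing import Pool

def build_section(section):
    """The same task applied to one chunk of the data (here, just summing squares)."""
    return sum(x * x for x in section)

if __name__ == "__main__":
    # Split one big job into 16 chunks, one per worker, as in the house-building analogy.
    data = list(range(1_000_000))
    chunks = [data[i::16] for i in range(16)]

    with Pool(processes=16) as pool:
        partial_results = pool.map(build_section, chunks)   # same task, different data, in parallel

    print(sum(partial_results))
```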

But to make parallel computing and specialized accelerators work, there are serious challenges, the most complex of which is where the hardware meets the software. Because the hardware is so specialized, programmers writing software code must learn how each element of the chip communicates with the rest to ensure the software’s compatibility.

Batten is working on creating new hardware-software abstractions—essentially mediators that help the software communicate with the hardware—that are clean and elegant, but still give some kind of specialization benefit.

Another researcher addressing these challenges is Christoph Studer, assistant professor, who has a rare mix of expertise in communication theory (methods of transmitting and decoding data) and integrated circuit design, allowing him to consider the full design process and its complex interplay all at once. “The engineers that just implement, they’re limited by the algorithms given to them. They cannot change the algorithms because they don’t understand the theory behind them,” said Studer, who adds that it can take years for industry to pick up on new theoretical results. In Cornell’s Computer Systems Laboratory, by contrast, novel theoretical results can be implemented in circuits almost immediately.

“The key point is that the biggest saving, if you want to reduce power consumption, is by implementing the right algorithm,” said Studer. A common example is the Fourier transform—a mathematical method first developed in the early 1800s for analyzing the frequency content of signals. In 1965, a much faster algorithm for computing it was developed, and it revolutionized science and engineering. The fast Fourier transform, or FFT, is now used to transmit data wirelessly and to compress files so they take up less space, among many other applications.
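A quick way to see what choosing the right algorithm buys: the naive discrete Fourier transform below performs on the order of n² operations, while the FFT computes the same result in roughly n·log n. The test signal and its length are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.standard_normal(1024)
n = len(signal)

# Naive DFT: build the full n-by-n transform matrix, about n^2 multiply-adds.
k = np.arange(n)
dft_matrix = np.exp(-2j * np.pi * np.outer(k, k) / n)
naive = dft_matrix @ signal

# FFT: the 1965 fast algorithm, roughly n * log2(n) operations.
fast = np.fft.fft(signal)

print(np.allclose(naive, fast))      # True: the same answer, up to rounding
print(n * n, int(n * np.log2(n)))    # 1048576 vs 10240 operations, roughly a 100x gap
```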

Studer says efficient algorithms are becoming more important as future communication systems demand methods that require more battery power. “We’re looking at a 5G standard [for cellphones] in about five to 10 years. Unfortunately, the batteries won’t be significantly better in 10 years,” said Studer, who adds that the scaling of battery technology is much slower than that of integrated circuit technology.

Analog engineers like Apsel and Afshari, who rely on tiny, low-power sensors and signal processors to read and decode data from the human body, also rely on engineers like Studer, who can develop algorithms that make sense of the data their devices generate.

“Cornell is getting really good at [circuit design] because we cover the full spectrum,” said Studer, who adds that the research being done at the School of Electrical and Computer Engineering offers great potential for startup companies to form, especially with the emergence of Cornell Tech.

Batten says chip startups have great potential to contribute innovations to the circuit industry because they don’t have the luxury of working off years of built-up intellectual capital. Instead, they rely on exploring new concepts, such as field-programmable gate arrays—circuits built from reconfigurable logic fabrics that can be reprogrammed after they are manufactured.
“It’s an exciting time to get into the field because industry seems open to try new things to address these challenges. Everyone is looking for ways to deal with and manage this complexity,” said Batten.
—Syl Kacapyr
