TOPIC FOR TECHNOLOGY
Sunday, June 7, 2015
The Possibilities of 3D Printing: It’s Only the Beginning
3D printing allows for distributed manufacturing, meaning products can be created on demand at a nearby facility. In the near future, this will let consumers purchase goods that fit their very specific needs, and have those goods printed and shipped in a matter of hours rather than the weeks it can take to receive a custom item.
Tool measures the distance between phonon collisions
Today’s computer chips pack billions of tiny transistors onto a plate of silicon no wider than a fingernail. Each transistor, just tens of nanometers wide, acts as a switch that, in concert with others, carries out a computer’s computations. As dense forests of transistors signal back and forth, they give off heat, and if a chip gets too hot, the heat can fry the electronics.
Manufacturers commonly apply a classical diffusion theory to gauge a transistor’s temperature rise in a computer chip. But now an experiment by Massachusetts Institute of Technology (MIT) engineers suggests that this common theory doesn’t hold up at extremely small length scales. The group’s results indicate that the diffusion theory underestimates the temperature rise of nanoscale heat sources, such as a computer chip’s transistors. Such a miscalculation could affect the reliability and performance of chips and other microelectronic devices.
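To see why scale matters, consider the classical prediction itself. The Python sketch below is a minimal illustration, not a calculation from the MIT experiment: it applies textbook Fourier diffusion to a small spherical heat source and flags where the source size drops below the phonon mean free path, the regime where the diffusive estimate becomes suspect. The conductivity and mean-free-path figures are rough assumed values.

```python
# Illustrative sketch (not from the MIT study): classical Fourier diffusion
# predicts the steady-state temperature rise of a small spherical heat source
# embedded in bulk silicon as dT = Q / (4 * pi * k * r). When the source
# radius r shrinks below the phonon mean free path, transport becomes
# ballistic and the diffusive estimate is no longer reliable.

import math

K_SILICON = 150.0      # W/(m*K), approximate bulk thermal conductivity (assumed)
MFP_SILICON = 100e-9   # m, rough order of magnitude for dominant phonon MFPs (assumed)

def diffusive_delta_t(power_w: float, radius_m: float) -> float:
    """Fourier-law temperature rise for a spherical source in an infinite medium."""
    return power_w / (4.0 * math.pi * K_SILICON * radius_m)

for radius in (10e-6, 1e-6, 100e-9, 20e-9):   # micron scale down to transistor scale
    dt = diffusive_delta_t(1e-3, radius)       # 1 mW heat source (assumed)
    regime = "diffusive" if radius > MFP_SILICON else "ballistic (diffusion suspect)"
    print(f"r = {radius*1e9:8.1f} nm  ->  dT = {dt:8.3f} K   [{regime}]")
```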
Entangled photons unlock super-sensitive characterization of quantum tech
Scientists and engineers from the Univ. of Bristol and the Centre for Quantum Technologies in Singapore have demonstrated a new protocol for estimating unknown optical processes, called unitary operations, with a precision enhanced by the unique properties of quantum mechanics.
The work, published in Optica, could lead to both dramatically better sensors for medical research and new approaches to benchmark the performance of ultra-powerful quantum computers.
History tells us that the ability to measure parameters and sense phenomena with increasing precision leads to dramatic advances in identifying new phenomena in science and in improving the performance of technology: famous examples include X-ray imaging, magnetic resonance imaging (MRI), interferometry and the scanning-tunneling microscope.
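The size of the precision gain at stake can be seen in a back-of-the-envelope comparison. The sketch below is not the Bristol team's protocol; it only illustrates the generic scaling argument, in which N independent photons give a phase uncertainty at the standard quantum limit of 1/sqrt(N), while an N-photon entangled (NOON-type) probe can approach the Heisenberg limit of 1/N.

```python
# Minimal sketch (not the Bristol protocol itself): the generic precision
# advantage of entangled probes in optical phase estimation.

import math

def standard_quantum_limit(n_photons: int) -> float:
    """Phase uncertainty with N independent photons: 1/sqrt(N)."""
    return 1.0 / math.sqrt(n_photons)

def heisenberg_limit(n_photons: int) -> float:
    """Best-case uncertainty with an N-photon entangled probe: 1/N."""
    return 1.0 / n_photons

for n in (1, 4, 16, 64):
    sql, hl = standard_quantum_limit(n), heisenberg_limit(n)
    print(f"N = {n:3d}: SQL = {sql:.3f} rad, Heisenberg = {hl:.4f} rad, "
          f"gain = {sql / hl:.1f}x")
```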
A foundation for quantum computing
Quantum computers are in theory capable of simulating the interactions of molecules at a level of detail far beyond the capabilities of even the largest supercomputers today. Such simulations could revolutionize chemistry, biology and materials science, but the development of quantum computers has been limited by the ability to increase the number of quantum bits, or qubits, that encode, store and access large amounts of data.
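A rough way to appreciate that gap: simulating n qubits on a classical machine means storing 2^n complex amplitudes, so the memory requirement doubles with every added qubit. The quick sketch below assumes standard double-precision complex numbers; the figures are generic arithmetic, not numbers from the paper.

```python
# Back-of-the-envelope sketch of why classical simulation motivates quantum
# hardware: the full state of n qubits is 2**n complex amplitudes, each
# taking 16 bytes at double precision.

BYTES_PER_AMPLITUDE = 16  # two 8-byte floats per complex number

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (20, 30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")   # 50 qubits is already petabyte scale
```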
In a paper published in the Journal of Applied Physics, a team of researchers at the Georgia Tech Research Institute (GTRI) and Honeywell International has demonstrated a new device that allows more electrodes to be placed on a chip, an important step that could help increase qubit densities and bring us one step closer to a quantum computer that can simulate molecules or perform other algorithms of interest.
Cooling the cloud
The data-center industry has been shifting from open-air cooling of its facilities to increasingly complex systems that segregate hot air from cold air. Aisle containment systems offer clear cost advantages, with estimated savings of about 30% of cooling energy, but it is not yet clear how they affect the risk of overheating, or how to design them for the greatest safety and optimum energy efficiency.
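To put that 30% figure in context, here is a hedged back-of-the-envelope calculation. Only the 30% savings figure comes from the article; the facility size, cooling fraction, and electricity price are invented assumptions for illustration.

```python
# Hedged worked example using the article's ~30% figure; the facility numbers
# below (1 MW IT load, cooling at 50% of IT power, $0.10/kWh) are assumptions.

IT_LOAD_KW = 1000.0          # assumed IT equipment load
COOLING_FRACTION = 0.5       # assumed: cooling draws half the IT power
CONTAINMENT_SAVINGS = 0.30   # from the article: ~30% of cooling energy
PRICE_PER_KWH = 0.10         # assumed electricity price, USD

cooling_kw = IT_LOAD_KW * COOLING_FRACTION
saved_kw = cooling_kw * CONTAINMENT_SAVINGS
annual_kwh = saved_kw * 24 * 365
print(f"Cooling load: {cooling_kw:.0f} kW")
print(f"Containment saves {saved_kw:.0f} kW, "
      f"~{annual_kwh:,.0f} kWh/yr (~${annual_kwh * PRICE_PER_KWH:,.0f}/yr)")
```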
Advance in quantum error correction
Quantum computers are largely theoretical devices that could perform some computations exponentially faster than conventional computers can. Crucial to most designs for quantum computers is quantum error correction, which helps preserve the fragile quantum states on which quantum computation depends.
The ideal quantum error correction code would correct any errors in quantum data, and it would require measurement of only a few quantum bits, or qubits, at a time. But until now, codes that could make do with limited measurements could correct only a limited number of errors—one roughly equal to the square root of the total number of qubits. So they could correct eight errors in a 64-qubit quantum computer, for instance, but not 10.
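The flavor of error correction by redundancy can be shown with a classical toy: a three-bit repetition code that fixes any single bit flip by majority vote. Quantum codes are far more subtle, since they must diagnose errors through syndrome measurements on a few qubits at a time without reading out the protected data, but the sketch below captures the basic idea of trading extra bits for error tolerance.

```python
# Toy classical analogy (not a quantum code): a 3-bit repetition code corrects
# any single bit flip by majority vote.

import random

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def flip_one(codeword: list[int]) -> list[int]:
    noisy = codeword[:]
    i = random.randrange(3)          # at most one error, within the code's budget
    noisy[i] ^= 1
    return noisy

def decode(codeword: list[int]) -> int:
    return 1 if sum(codeword) >= 2 else 0   # majority vote

for bit in (0, 1):
    received = flip_one(encode(bit))
    assert decode(received) == bit           # single errors are always corrected
    print(f"sent {bit}, received {received}, decoded {decode(received)}")
```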
Breakthrough heralds super-efficient light-based computers
Stanford Univ. electrical engineer Jelena Vuckovic wants to make computers faster and more efficient by reinventing how they send data back and forth between chips, where the work is done.
In computers today, data is pushed through wires as a stream of electrons. That takes a lot of power, which helps explain why laptops get so warm.
"Several years ago, my colleague David Miller carefully analyzed power consumption in computers, and the results were striking," said Vuckovic, referring to electrical engineering Prof. David Miller. "Up to 80% of the microprocessor power is consumed by sending data over the wires—so called interconnects."
Maximizing the Value of Scientific Literature
For years manual curation of scientific publications has been the gold standard, with technology-based solutions ranking far behind in terms of accuracy and completeness. Today, that’s no longer the case. Versatile, well-designed and well-tested applications, combined with significantly enhanced computational power, are elevating automated curation to a more equivalent position. Proprietary text-mining technologies now rival manual curation for some types of search needs as a means of ensuring researchers aren’t missing out on valuable information.
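As a toy illustration of what automated curation does at its simplest, the snippet below pulls gene-symbol-like tokens out of an invented abstract with a regular expression. Production text-mining systems rely on trained NLP models and curated ontologies rather than this crude pattern; both the abstract and the pattern here are made up for the example.

```python
# Toy illustration only: a regex-based entity extractor of the kind that
# sits at the very bottom of automated curation pipelines. The abstract
# and pattern are invented for this example.

import re

abstract = ("We report that overexpression of TP53 and BRCA1 modulates "
            "apoptosis in MCF-7 cells treated with doxorubicin.")

# Crude pattern for gene-symbol-like tokens: an uppercase letter followed
# by 1-5 more uppercase letters or digits.
gene_pattern = re.compile(r"\b[A-Z][A-Z0-9]{1,5}\b")

hits = sorted(set(gene_pattern.findall(abstract)))
print("Candidate gene symbols:", hits)   # ['BRCA1', 'MCF', 'TP53'] -- note the noise
```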