Thursday, February 18, 2016

Cheap paper skin mimics the real thing

    Human skin’s natural ability to feel sensations such as touch and temperature difference is not easily replicated with artificial materials in the research lab. That challenge did not stop a Saudi Arabian research team from using cheap household items to make a “paper skin” that mimics many sensory functions of human skin. The artificial skin may represent the first single sensing platform capable of simultaneously measuring pressure, touch, proximity, temperature, humidity, flow, and pH levels. Previously, researchers have tried using exotic materials such as carbon nanotubes or silver nanoparticles to create sensors capable of measuring just a few of those things. By comparison, the team at King Abdullah University of Science and Technology (KAUST) in Saudi Arabia used common off-the-shelf materials such as paper sticky notes, sponges, napkins and aluminum foil. Total material cost for a paper skin patch 6.5 centimeters on each side came to just $1.67.
        "Its impact is beyond low cost: simplicity," says Muhammad Mustafa Hussain, an electrical engineer at KAUST. “My vision is to make electronics simple to understand and easy to assemble so that ordinary people can participate in innovation.” The paper skin’s low cost and wide array of capabilities could have a huge impact on many technologies. Flexible and wearable electronics for monitoring human health and fitness could become both cheaper and more widely available. New human-computer interfaces—similar to today’s motion-sensing or touchpad devices—could emerge based on the paper skin’s ability to sense pressure, touch, heat, and motion. The paper skin could also become a cheap sensor for monitoring food quality or outdoor environments.
        Last but not least, cheap artificial skin could give robots the capability to feel their environment in the same way that humans do, Hussain says. In a paper detailing the research—published in the 19 February issue of the journal Advanced Materials Technologies—the researchers said:
       "The envisioned applications of such artificial skin takes a lot of surface area coverage (like                 robotic skins or skins for robots). There, lowering cost is crucial while not compromising                     performance. In that sense, if mechanical ruggedness can be proved, there is no scientific or                 technical reason for not accepting paper skin as a viable option."
The team’s low-cost approach often seems as simple as a classroom experiment. As an example, researchers built a pressure sensor by sandwiching a napkin or sponge between two metal contacts made from aluminum foil. The same simple device could also detect touch and flow based on changes in pressure. Its aluminum foil even allowed it to act as a proximity sensor for electromagnetic fields with a detection range of 13 centimeters.
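        The pressure sensing here can be pictured with a simple parallel-plate capacitor model: squeezing the sponge or napkin pulls the two foil electrodes closer together, which raises the capacitance between them, and that change is what gets read out. The sketch below is only an illustration of that idea; the sponge thickness and permittivity are assumed values, not numbers reported by the KAUST team.

    # Rough parallel-plate model of the foil/sponge pressure sensor described above.
    # The sponge thickness and relative permittivity are illustrative assumptions.

    EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

    def capacitance(area_m2, gap_m, relative_permittivity):
        """Capacitance of a parallel-plate stack: C = eps0 * eps_r * A / d."""
        return EPSILON_0 * relative_permittivity * area_m2 / gap_m

    area = 0.065 * 0.065        # a 6.5 cm x 6.5 cm patch
    eps_sponge = 1.5            # assumed effective permittivity of a dry sponge

    c_rest = capacitance(area, 3e-3, eps_sponge)      # uncompressed: ~3 mm gap
    c_pressed = capacitance(area, 2e-3, eps_sponge)   # pressed: sponge squeezed to ~2 mm

    print(f"at rest: {c_rest * 1e12:.1f} pF")
    print(f"pressed: {c_pressed * 1e12:.1f} pF")
    # Pressing the patch shrinks the gap, so the capacitance rises; that change
    # is what a simple readout circuit would register as "pressure".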

431

Sunday, February 14, 2016

Intel's Core-M Processor - what you need to know

        Intel's new Core-M processor is at the heart of a laptop revolution - and a new wave of fanless computers. Running at only 4.5 watts, compared to the 11.5 watts of an i5 processor or 57 watts of a quad-core i7, this processor doesn't require a fan-cooled heat sink. With its low power consumption and low heat generation, manufacturers can build laptops that are thinner than ever before, and going fanless means those devices also make less noise. For example, the latest MacBook is about half an inch thick, and the second-generation Lenovo ThinkPad Helix is just 0.38 inches thick, compared to its 0.46-inch, Core i5-powered predecessor. Where the original Helix's keyboard dock had a hinge with dual fans built in, the new Ultrabook Keyboard doesn't have a single fan. Last year's model lasted just 5 hours and 48 minutes when detached from its dock, but Lenovo promises 8 hours of endurance from the Core M-powered Helix.
       Though Core M is the first processor based on Intel's new 14nm Broadwell architecture, it certainly won't be the last. In 2015, Intel used Broadwell as the basis of its next-generation Core i3 / i5 / i7 chips for both laptops and desktops. Over the past few years, Intel has released a new processor architecture on an annual basis, with a die shrink occurring every other generation. The Haswell and prior-gen Ivy Bridge architectures both use a 22nm process, while Broadwell is the first to use 14nm. A smaller manufacturing process means that Intel can fit more transistors into the same area, using less power, generating less heat, and taking up less room in the chassis. The Core M processor package eats up just 495 square millimeters of board space, about half the size of the 960-square-millimeter 4th Generation Core Series package.

301

Saturday, February 13, 2016

Fusion - Germany's Wendelstein 7-X

       Yesterday, one of the grandest experimental fusion reactors in the world flared to life, converting hydrogen into a plasma for less than a second. The honor of pressing the button went to Angela Merkel, who holds a PhD in quantum chemistry and also happens to be the Chancellor of Germany.
Why such a high-profile ribbon-cutting? Fusion is a kind of nuclear power source, the same process that happens on a much larger scale in the hearts of stars. Theoretically, if you could get light atoms to fuse into heavier atoms, the reaction (which happens at immense temperatures and pressures) would provide a clean source of energy that could continue almost indefinitely, without the radioactive byproducts of nuclear fission (the method currently employed at nuclear power plants).
        The German experiment, called Wendelstein 7-X, received funding or components from Germany, Poland, and the United States. This is the first run with hydrogen, though it did some initial work creating helium plasma last year. Though the hydrogen plasma was short-lived, it was an exciting moment for researchers. “With a temperature of 80 million degrees Celsius and a lifetime of a quarter of a second, the device’s first hydrogen plasma has completely lived up to our expectations”, Hans-Stephan Bosch, head of operations for Wendelstein 7-X said. The Wendelstein 7-X is not designed to produce energy. Instead, the experiment is focused solely on producing and maintaining a levitating ball of super-heated plasma, which is a key step towards fusion energy.
The Germans aren't the only ones working on fusion, though. In France, the largest fusion reactor ever made, called ITER, is under construction. Private companies are in on the race too, with Lockheed Martin also working on a fusion reactor design.
        While all of these projects aim at a similar goal, the designs used by the various groups differ. One of the more popular designs is the tokamak, a Russian design used by ITER. It uses a doughnut-shaped machine to generate a magnetic field that contains the hot plasma. The Wendelstein 7-X, on the other hand, is a stellarator. While it is also doughnut-shaped, it has the distinct advantage of theoretically being able to run continuously, instead of in pulses like a tokamak. If the Wendelstein 7-X succeeds in sustaining plasma for long periods of time (the team hopes to reach 30 minutes by 2025, if not earlier), it might show that the stellarator design could be used in future fusion power plants.

417

Thursday, February 11, 2016

Breaking News: Gravitational Waves - never seen before - "the breakthrough of the century"

        Today, scientific history was made. At 3:30pm today, the National Science Foundation held a press conference to "update the scientific community on efforts to detect" gravitational waves. They reported that, for the first time, scientists have observed these gravitational ripples in the fabric of spacetime, arriving at Earth from a cataclysmic event in the distant universe. This discovery confirms a major prediction of Albert Einstein's 1915 general theory of relativity. It also opens up an unprecedented new view on the cosmos.
        We have been able to see the universe ever since the first human looked upwards to the skies. With the advent of the telescope in 1610, we began using instruments to extend our sense of sight ever further into the Universe. Gravitational waves are different, however. They are not at all like light or any of the other electromagnetic radiations such as radio waves, X-rays, infrared, or ultraviolet. Instead, they are 'ripples' in the fabric of the universe (space and time itself!). These ripples can be compared to sound, which is essentially an oscillating wave in the air, and researchers can even turn simulated gravitational wave signals into audible sounds. Those sounds can then be translated into a piece of music, like a "gravitational wave symphony". The simulated signals are oscillations that increase in frequency until they abruptly stop with a 'chirp'. These are the tell-tale signals for which gravitational wave astronomers search.
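        To get a feel for what that 'chirp' is, here is a toy waveform: a sine wave whose frequency and loudness sweep upward and then cut off, loosely mimicking the final moments of an inspiral. It is purely illustrative (the sweep range and duration are assumptions), not one of LIGO's actual template signals; it only assumes NumPy is available.

    import numpy as np

    # Toy "chirp": a sine wave whose frequency and amplitude ramp up and then stop,
    # loosely mimicking the inspiral signal described above. Numbers are illustrative.
    sample_rate = 4096                      # samples per second
    t = np.arange(0, 1.0, 1.0 / sample_rate)

    f_start, f_end = 35.0, 250.0            # sweep from 35 Hz up to 250 Hz over 1 s
    freq = f_start + (f_end - f_start) * t  # linearly increasing frequency
    phase = 2 * np.pi * np.cumsum(freq) / sample_rate
    amplitude = t ** 2                      # grows toward the merger, then...
    signal = amplitude * np.sin(phase)
    signal[t > 0.95] = 0.0                  # ...cuts off abruptly: the "chirp"

    # If this array were written to a WAV file and played back, you would hear a
    # rising tone that ends in a sudden stop, the kind of sound the simulated
    # gravitational wave signals produce.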
        The LIGO project (Laser Interferometer Gravitational-Wave Observatory), which has been searching for gravitational waves and has now found them, was established in 1992 by MIT and Caltech, with detectors in Livingston, Louisiana and Hanford, Washington, and with contributions from numerous other universities. The National Science Foundation funds the project, with additional contributions from other international scientific groups. It didn't detect anything from 2002 to 2010, and after a 5-year shutdown to upgrade its detectors, it came back online in the fall of 2015 with four times the sensitivity it had before the upgrade.
        As I said before, gravitational waves were predicted by Albert Einstein back in 1915 on the basis of his general theory of relativity. They are not possible in the Newtonian theory of gravitation, which assumes that physical interactions propagate at infinite speed, so this discovery also disproves part of that older theory.
        Gravitational waves carry information about their dramatic origins and about the nature of gravity that cannot otherwise be obtained. Physicists have concluded that the detected gravitational waves were produced during the final fraction of a second of the merger of two black holes into a single, more massive spinning black hole. This collision of two black holes had been predicted but never observed. Scientists at LIGO estimate that the black holes involved in the event were 29 and 36 times the mass of the sun, and that the event took place 1.3 billion years ago. During the merger, mass equal to about 3 times the mass of the sun was converted in a fraction of a second into gravitational waves, with a peak power output about 50 times that of the whole visible universe. This all happens because, according to general relativity, as a pair of black holes orbit around each other, they lose energy through the emission of gravitational waves. That causes them to gradually approach each other over billions of years, and then much more quickly in the final minutes. During the final fraction of a second, the two black holes collide at nearly one-half the speed of light and form a single, more massive black hole, converting a portion of the combined black holes' mass to energy, according to Einstein's formula E=mc2. This energy is emitted as a final strong burst of gravitational waves. It is these gravitational waves that LIGO has observed. By comparing the arrival times of the signals (the detector in Livingston recorded the event 7 milliseconds before the detector in Hanford), scientists can say that the source was located in the Southern Hemisphere.
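        Those headline numbers can be sanity-checked with nothing more than Einstein's formula and the speed of light. The little calculation below uses standard physical constants; the roughly 3,000 km detector separation is an approximate figure used for illustration.

    # Back-of-the-envelope check on the numbers quoted above.
    C = 2.998e8          # speed of light, m/s
    M_SUN = 1.989e30     # solar mass, kg

    # About three solar masses converted to gravitational-wave energy (E = m c^2):
    energy_joules = 3 * M_SUN * C ** 2
    print(f"energy radiated: ~{energy_joules:.1e} J")   # roughly 5.4e47 J

    # The two LIGO sites are roughly 3,000 km apart, so a gravitational wave
    # (travelling at the speed of light) can arrive at one detector at most about
    # 10 ms before the other, consistent with the 7 ms offset reported.
    separation_m = 3.0e6
    print(f"maximum arrival delay: ~{separation_m / C * 1e3:.0f} ms")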
        The ability to detect gravitational waves will not just bring us new views of objects that we can already see, but will allow us to detect and study objects and events that are currently completely hidden from view. It also means that after years of research, hard work, and technological innovation, Einstein's prediction has finally been confirmed.



For more information see these links:
https://en.wikipedia.org/wiki/Gravitational_wave
https://en.wikipedia.org/wiki/LIGO
http://www.ligo.org/
https://www.ligo.caltech.edu/
http://www.nytimes.com/2016/02/12/science/ligo-gravitational-waves-black-holes-einstein.html?_r=0
http://www.nytimes.com/2015/11/24/science/a-century-ago-einsteins-theory-of-relativity-changed-everything.html
http://www.nasa.gov/feature/goddard/2016/nsf-s-ligo-has-detected-gravitational-waves
https://www.theguardian.com/science/2016/feb/11/gravitational-waves-discovery-hailed-as-breakthrough-of-the-century
https://www.theguardian.com/science/across-the-universe/live/2016/feb/11/gravitational-wave-announcement-latest-physics-einstein-ligo-black-holes-live
https://www.theguardian.com/science/2016/feb/09/gravitational-waves-everything-you-need-to-know
https://www.theguardian.com/science/across-the-universe/2016/feb/11/todays-gravitational-wave-announcement-could-be-two-great-discoveries-in-one

713

Saturday, February 6, 2016

Super-large space telescopes

        Throughout history, there have been four things that have determined just how much information we can glean about the Universe through astronomy:

1. The size of your telescope, which determines both how much light you can gather in a given amount of time and also your resolution.
2. The quality of your optical systems and cameras/CCDs, which allow you to maximize the amount of light that becomes usable data.
3. The “seeing” through the telescope (the blurring caused by the atmosphere), which can be minimized by high altitudes, still air, cloudless nights and adaptive optics technology.
4. And your techniques of data analysis, which can ideally make the most of every single photon of light that comes through.

        There have been tremendous advances in ground-based astronomy over the past 25 years, but they’ve occurred almost exclusively through improvements of numbers 2 and 4. The largest telescope in the world in 1990 was the Keck 10-meter telescope, and while there are a number of 8-to-10 meter class telescopes today, 10 meters is still the largest class of telescope in existence. Moreover, we’ve really reached the limits of what improvements in those areas can achieve without going to larger apertures. This isn’t intended to minimize the gains in these other areas; they’ve been tremendous. But it’s important to realize how far we’ve come. The charge-coupled devices (CCDs) that are mounted to telescopes can focus on either wide-field or very narrow areas of the sky, gathering all the photons in a particular band over the entire field-of-view or performing spectroscopy (breaking up the light into its individual wavelengths) for up to hundreds of objects at once. We can cram more megapixels into a given surface area. Quite simply, we’re at the point where practically every photon of the right wavelength that comes in through a telescope’s mirror can be utilized, and where we can observe for longer and longer periods of time to go deeper and deeper into the Universe if we need to.

        And finally, computational power and data analysis techniques have improved tremendously, so that more useful information can be recorded and extracted from the same data we take. These are tremendous advances, but just like a generation ago, we’re still using the same size telescopes. If we want to go deeper into the Universe, to higher resolution, and to greater sensitivities, we have to go to larger apertures: we need a bigger telescope. There are currently three major projects competing to be first: the Thirty Meter Telescope atop Mauna Kea, the 39-meter European Extremely Large Telescope in Chile, and the 25-meter Giant Magellan Telescope (GMT), also in Chile. These represent the next giant leap forward in ground-based astronomy, and the Giant Magellan Telescope will probably be first, having broken ground at the end of last year, with early operations planned to begin as soon as 2021 and full operations by 2025.
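        The payoff of a bigger mirror comes down to two textbook scalings: light-gathering power grows with the square of the diameter, and the diffraction limit on resolution shrinks in direct proportion to it. Here is a quick illustrative calculation, assuming visible light at 550 nm (real telescopes only approach the diffraction limit with adaptive optics):

    import math

    # Why aperture matters: collecting area scales with the square of the mirror
    # diameter, and the Rayleigh diffraction limit shrinks linearly with it:
    # theta ~ 1.22 * wavelength / D.

    def light_gain(d_new_m, d_old_m):
        return (d_new_m / d_old_m) ** 2

    def diffraction_limit_arcsec(d_m, wavelength_m=550e-9):
        theta_rad = 1.22 * wavelength_m / d_m          # visible light by default
        return math.degrees(theta_rad) * 3600

    for name, d in [("Keck", 10.0), ("GMT", 25.0), ("E-ELT", 39.0)]:
        print(f"{name:6s} {d:4.0f} m   light vs Keck: {light_gain(d, 10.0):5.1f}x   "
              f"diffraction limit: {diffraction_limit_arcsec(d):.4f} arcsec")

    # A 39 m mirror collects about 15x the light of a 10 m one and can, in
    # principle, resolve detail roughly 4x finer.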


491

Friday, February 5, 2016

Concussions, A Disastrous Physical and Biological Hybrid, and what we're doing to stop them

        During Sunday’s Super Bowl, the showdown between the Denver Broncos and the Carolina Panthers, the competition is sure to be fierce. There will be an estimated 130-plus plays consisting of hundreds of hits, tackles, spears, and layouts. That translates into at least minor head trauma for most players.
        In 2007, three former NFL players committed suicide. Andre Waters, 44, shot himself in the head after bouts with depression. Terry Long drank a bottle of antifreeze at 45. Thirty-six-year-old Justin Strzelczyk heard voices and died in a crash while fleeing police. Every autopsy was performed by physician Bennet Omalu, and every one showed signs of brain damage. Omalu, a former neuropathologist at the University of Pittsburgh School of Medicine, was the first to pinpoint forensic evidence of football-induced chronic traumatic encephalopathy (CTE), a variation of "boxer's dementia." Symptoms include confusion, mood disorders, slurred speech and memory loss. This put heavy pressure on the NFL to address the problem, and a new committee, the NFL Head, Neck and Spine Committee, was formed to bring genuine neurological insight to these traumatic injuries. Evan Breedlove, a mechanical engineering graduate student at Purdue University who studies neurotrauma in Indiana high-school football players, says that hitting a player in the head is like shaking a Jell-O mold on a platter. The brain shakes, and little splits called microhemorrhages can form. The splits also allow fluid in, which increases the likelihood of further concussions. "It's generally a bad thing when the brain gets exposed to the chemistry in the rest of your body," Breedlove says. The average NFL player sustains as many as 1,500 hits to the head throughout a season. It's the accumulation of impact after impact that does the real damage. "The big hit may just be the straw that breaks the camel's back," he says.
        According to the NFL, there were 271 documented game-related concussions this past season, the most recorded by the league since 2011. Roughly one-third of those were caused by helmet-to-helmet contact. One of the worst of those hits occurred in January, during a grinding back-and-forth playoff match between the Cincinnati Bengals and the Pittsburgh Steelers, a game generally regarded as one of the season’s dirtiest. The number of concussions in the NFL has increased by at least 20 percent each season for the past three years. The rate of concussions among high-school and college players, where they often go unreported, is probably much higher.
        Many companies, most notably Riddell, have developed new helmets to help prevent concussions. During the 2011 season, NFL officials introduced "smart helmets" and mouthguards outfitted with accelerometers and radio-frequency identification to measure the location and direction of hits experienced during a game or practice. The data is wirelessly transmitted to a computer on the sidelines, which calculates the magnitude of the hit and the location of the blow. In a two-year pilot program with high-school and college players, the system gathered data on more than 1.5 million head impacts. Researchers at Riddell, the helmet's manufacturer, set a concussion threshold of 98 Gs per game; more than that, and they recommend benching the player. Each helmet costs about $1,000. A less-expensive helmet from Riddell, prepared for the 2012 season for high-school and college players, used a thin film to gauge hits. On impact, pressure compresses the film, which generates a charge, sending a signal to a handheld unit on the sideline. In 2010, Kevin Guskiewicz of the NFL's Head, Neck and Spine Committee said he would like to see "every player geared up with some kind of monitoring".
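        The logic on that sideline computer can be imagined as something very simple: compare each recorded impact against the threshold and flag the player if it is exceeded, while keeping a running count of total hits. The sketch below is only an illustration of that idea, with made-up telemetry numbers; it is not Riddell's actual algorithm.

    # Minimal sketch of the sideline logic described above: flag any recorded head
    # impact above a g-force threshold. Illustration only, not Riddell's system.

    CONCUSSION_THRESHOLD_G = 98.0   # threshold quoted in the article

    def review_hits(hits_g):
        """Return the hits that exceed the threshold, plus the total hit count."""
        flagged = [g for g in hits_g if g >= CONCUSSION_THRESHOLD_G]
        return flagged, len(hits_g)

    # Hypothetical telemetry from one player's game:
    game_hits = [12.5, 40.2, 23.1, 101.7, 55.0, 98.3]
    flagged, total = review_hits(game_hits)

    if flagged:
        print(f"Recommend benching: {len(flagged)} hit(s) above "
              f"{CONCUSSION_THRESHOLD_G:.0f} g -> {flagged}")
    print(f"{total} impacts logged this game (the cumulative count matters too).")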

591

Thursday, February 4, 2016

Nanites: The Future of Tech

       In 1959, renowned physicist Richard Feynman talked about what would later be called nanotechnology. But it wasn't until 1986 that the ideas of atomic-sized structures and manipulation of matter at that level resurfaced, this time in K. Eric Drexler's book Engines of Creation: The Coming Era of Nanotechnology, which proposed a nanoscale "assembler" that would be able to build a copy of itself, and of other items of arbitrary complexity, with atomic control. For decades, nanotechnology remained little more than a sci-fi fantasy in which nanites could do amazing things such as heal wounds, build massive structures in time frames impossible for humans, and cure diseases inside the body. That began to change in 2005, when researchers at Rice University built a "nanocar" out of fullerene molecules on a chassis of carbon compounds. The precision required to do this was remarkable, as the car was about 20,000 times thinner than a human hair.
        However, the creation of actual working nanites would require major advances in processing power and miniaturization, because building a robot at the nanoscale requires correspondingly tiny electronics to make it work, not to mention protecting the robot for missions inside the body and in adverse climates.
        Only time will tell.

213