Datasets: BAAI /
Modalities: Text
Formats: json
Libraries: Datasets, Dask
License:
Columns: id (string, lengths 30-34), text (string, lengths 0-71.3k), industry_type (string, 1 class)
2016-40/3982/en_head.json.gz/2487
Understanding Mechanical Properties Of Silicon Nanowires

Silicon nanowires are attracting significant attention from the electronics industry due to the drive for ever-smaller electronic devices, from cell phones to computers. The operation of these future devices, and a wide array of additional applications, will depend on the mechanical properties of these nanowires. New research from North Carolina State University shows that silicon nanowires are far more resilient than their larger counterparts, a finding that could pave the way for smaller, sturdier nanoelectronics, nanosensors, light-emitting diodes and other applications.

It is no surprise that the mechanical properties of silicon nanowires differ from "bulk" (regular-size) silicon materials, because the surface-to-volume ratio increases as the diameter of the wires decreases. Unfortunately, experimental results reported in the literature on the properties of silicon nanowires have been conflicting. So the NC State researchers set out to quantify the elastic and fracture properties of the material.

"The mainstream semiconductor industry is built on silicon," says Dr. Yong Zhu, assistant professor of mechanical engineering at NC State and lead researcher on this project. "These wires are the building blocks for future nanoelectronics."

For this study, researchers set out to determine how much abuse these silicon nanowires can take. How do they deform, meaning how much can you stretch or warp the material before it breaks? And how much force can they withstand before they fracture or crack? The researchers focused on nanowires made using the vapor-liquid-solid synthesis process, which is a common way of producing silicon nanowires. Zhu and his team measured the nanowire properties using in-situ tensile testing inside a scanning electron microscope. A nanomanipulator was used as the actuator and a microcantilever as the load sensor.
"Our experimental method is direct but simple," says Qingquan Qin, a Ph.D. student at NC State and co-author of the paper. "This method offers real-time observation of nanowire deformation and fracture, while simultaneously providing quantitative stress and strain data. The method is very efficient, so a large number of specimens can be tested within a reasonable period of time."

As it turns out, silicon nanowires deform in a very different way from bulk silicon. "Bulk silicon is very brittle and has limited deformability, meaning that it cannot be stretched or warped very much without breaking," says Feng Xu, a Ph.D. student at NC State and co-author of the paper. "But the silicon nanowires are more resilient, and can sustain much larger deformation. Other properties of silicon nanowires include increasing fracture strength and decreasing elastic modulus as the nanowire gets smaller and smaller."

The fact that silicon nanowires have more deformability and strength is a big deal. "These properties are essential to the design and reliability of novel silicon nanodevices," Zhu says. "The insights gained from this study not only advance fundamental understanding about size effects on mechanical properties of nanostructures, but also give designers more options in designing nanodevices ranging from nanosensors to nanoelectronics to nanostructured solar cells."

The study, "Mechanical Properties of Vapor-Liquid-Solid Synthesized Silicon Nanowires," was co-authored by Zhu, Xu, Qin, University of Michigan (UM) researcher Wei Lu and UM Ph.D. student Wayne Fung. The study is published in the Nov. 11 issue of Nano Letters, and was funded by grants from the National Science Foundation and NC State.

Image Caption: These are silicon nanowires used in the in-situ scanning electron microscopy mechanical testing by Dr. Yong Zhu and his team. Credit: North Carolina State University
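The tensile test described above reduces to simple arithmetic: the measured force and elongation are converted into engineering stress and strain, and the elastic modulus is their ratio in the linear regime. A minimal sketch of that calculation; all numerical values here are hypothetical illustrations, not data from the study:

```python
# Engineering stress/strain from a nanowire tensile test.
# All example numbers below are hypothetical, not from the NC State study.
import math

def stress_strain(force_N, elongation_m, diameter_m, gauge_length_m):
    """Return engineering stress (Pa) and strain (dimensionless)."""
    area = math.pi * (diameter_m / 2) ** 2   # cross-sectional area of the wire
    stress = force_N / area                  # sigma = F / A
    strain = elongation_m / gauge_length_m   # epsilon = dL / L0
    return stress, strain

# Hypothetical example: 100 nm diameter wire, 2 um gauge length,
# 10 uN load producing 40 nm of elongation.
sigma, eps = stress_strain(10e-6, 40e-9, 100e-9, 2e-6)
E = sigma / eps  # elastic modulus, valid only in the linear (elastic) regime
```

The size effects the researchers report would show up here as a smaller `E` and a larger fracture `sigma` as `diameter_m` shrinks.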
Technology
2016-40/3982/en_head.json.gz/2491
T. rex not a stand-up guy? Test your dino skills
By MALCOLM RITTER | Posted: 02/25/2013 03:00:00 AM EST, Monday February 25, 2013

NEW YORK -- Here's a test of your dinosaur knowledge: Did Tyrannosaurus rex stand upright, with its tail on the ground?

The answer: No. But a lot of young people seem to think so, and the authors of a study are blaming toys like Barney and other pop influences for that misconception.

Scientists used to think T. rex stood tall, but they abandoned that idea decades ago. Now, the ferocious dinosaur is depicted in a bird-like posture, tail in the air and head pitched forward of its two massive legs. The change led major museums to update their T. rex displays, study authors said, and popular books have largely gotten the posture right since around 1990. So did the "Jurassic Park" movies.

But when the researchers asked college students and children to draw a T. rex, most gave it an upright posture instead. Why? They'd soaked up the wrong idea from toys like Barney, games and other pop culture items, the researchers conclude.

"It doesn't matter what they see in science books or even in 'Jurassic Park,'" says Warren Allmon, a paleontology professor at Cornell University in Ithaca, N.Y., and an author of the study. It struck him when he saw a box of dinosaur chicken nuggets at a grocery store. "What they grew up with on their pajamas and their macaroni and wallpaper and everything else is the tail-dragging posture," he said.

If the explanation is correct, Allmon said, it's a sobering reminder of how people can get wrong ideas about science. The study will be published in the Journal of Geoscience Education. The authors examined 316 T. rex drawings made by students at Ithaca College and children who visited an Ithaca museum. Most of the college students weren't science majors. Seventy-two percent of the college students and 63 percent of the children drew T. rex as being too upright.
Because the sample isn't representative of the general population, the results don't necessarily apply to young people in general. When the authors looked at other depictions of T. rex, they found the obsolete standing posture remains in pop culture items like toys, games, cookie cutters, clothing, comics and movies.

Mark Norell, a prominent paleontologist at the American Museum of Natural History in New York who didn't participate in the study, said he doesn't know if the upright-posture myth is as widespread as the new study indicates. But he said it makes sense that children's first impressions of T. rex can persist. If they don't study dinosaurs later, "that's what they're stuck with."
Technology
2016-40/3982/en_head.json.gz/2559
Resilient Electric Grid Project: Keeping the U.S. Electrical Grid Online
Fri, 08/28/2009 - 11:58am

With the new superconducting cable, Manhattan's electrical workers may be able to eventually clear out the aging, subterranean rats' nest beneath Wall Street that, amazingly, looks much the same today as it did a century ago (1913 image).

Barring the occasional thunderstorm, most Americans take the electric current behind their power buttons for granted, and assume the juice will be there when they're ready to fire up an appliance or favorite tech toy. Little do most know, the strain on our electric grid, which has led to rolling brownouts and the massive 2003 blackout that left 40 million people across the Northeast in the dark, will only intensify in coming years. According to the Department of Energy, the annual cost of power outages is approximately $80 billion. Now add to conventional challenges those risks posed by terrorists intent on crippling our economy. Suddenly, the aim of electrical engineers to develop a technology to keep the country's electrical grid online (and recover faster) really begins to resonate.

The Science and Technology Directorate (S&T) of the U.S. Department of Homeland Security is currently funding a promising solution: a superconductor cable that would link electrical substations and allow the sharing of excess capacity during emergencies. This generally is not done now, so flexibility like this strengthens the resiliency of the overall grid, reducing the likelihood of major power failures. This is S&T's Resilient Electric Grid project, and the superconducting cable is called an inherently fault current limiting (IFCL) superconductor cable.

A single superconducting cable (shown in blue) could one day replace a dozen traditional copper cables (shown in red), freeing up much-needed space beneath city streets.
Courtesy of US Department of Homeland Security - Science and Technology

Engineers are putting decades of existing electrical research by industry leaders American Superconductor, Southwire and Consolidated Edison into practice, as they eye the aging rats' nest of power cabling under the crowded streets of New York City. S&T managers and scientists recently participated in a successful test of the new superconducting technology at Oak Ridge National Laboratory.

The benefits are simple but profound: these cables can deliver more power, prevent power failures, and take up less physical space. A single superconductor cable can replace 12 copper cable bundles, freeing up more space underground for other utility needs such as water, natural gas or phone service. The technology is capable of carrying 10 times as much power as copper wires of the same size, while also being able to adapt automatically to power surges and disruptions from lightning strikes, heat waves, traffic accidents, even sabotage.

"The IFCL superconducting cable being tested could well revolutionize power distribution to the country's critical infrastructure," said Dr. Roger McGinnis, Director of the Homeland Security Advanced Research Project Agency at S&T. "Eventually, these technologies will help incorporate localized clean, green electricity generation into the power grid."

As for the science, the cables work by transmitting electricity with near zero resistance at higher temperatures than usual. But "high" is a relative term among superconductors: the cables conduct electricity at a chill -320°F instead of an icy -460°F for traditional superconductor cables. Holding and conducting energy better than traditional copper means these cables take up a fraction of the space. Manhattan's electrical workers may eventually be able to clear out the subterranean congestion beneath Wall Street that, amazingly, looks much the same today as it did a century ago.
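To put the quoted operating temperatures in more familiar scientific units, a quick conversion to kelvin shows why -320°F counts as "high" for a superconductor: it is roughly liquid-nitrogen temperature (about 77 K), while -460°F is essentially absolute zero. A small sanity-check sketch:

```python
# Convert the article's quoted cable temperatures from Fahrenheit to kelvin.
def fahrenheit_to_kelvin(f):
    return (f - 32) * 5.0 / 9.0 + 273.15

high_tc = fahrenheit_to_kelvin(-320)  # ~77.6 K: liquid-nitrogen territory
low_tc = fahrenheit_to_kelvin(-460)   # ~0 K: roughly absolute zero
```

Note that -460°F is a rounded figure; absolute zero is -459.67°F, which is why the second conversion lands a fraction of a degree below 0 K.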
Since the cables themselves better prevent extremely high currents from cascading through the system, they will help eliminate the power surges that can permanently damage electrical equipment, similar to a breaker switch in a home, explained McGinnis. The cable switches off during a surge or failure, but automatically resets when conditions return to normal.

For some context, electrical substations take electricity delivered over transmission and distribution lines and lower the voltage so it can be used by homes and businesses. Even if power is lost to an individual substation, by creating multiple, redundant paths for the electric current, the cables allow quick power restoration to all the surrounding power loads. Ultimately, these cables may allow substations that had been intentionally isolated from one another in the past, for fear of cascading failures, to be interconnected in order to share power and assets.

Cutting-edge high temperature superconducting cables have been successfully tested in laboratories, and can be found in a handful of demonstration projects around the country, but they remain an emerging technology. S&T is interested in advancing the technology so that it can be used nationwide, and is pursuing an opportunity to connect two Con Edison Manhattan substations with the cable. The Department of Homeland Security hopes to enable the Department of Energy and various utility companies around the country to replace more than 2,000 circuit miles of power cables in U.S. cities with resilient, safe and green IFCL cables.
Technology
2016-40/3982/en_head.json.gz/2578
New iPhones make a splash with colors, price
By Dan Nakaso and Troy Wolverton, San Jose Mercury News

CUPERTINO -- Confirming weeks of rumors, Apple executives on Tuesday unveiled a new, gold-colored iPhone 5S and a cheaper iPhone 5C designed to appeal to overseas markets.

The iPhone 5C borrows a page from Apple's iPods and will come in multiple colors. Prices start at $99 for a 16GB model and $199 for a 32GB model -- both with two-year contracts. The 5C features a case made out of plastic, which Apple's design guru, Jony Ive, described as "beautifully, unapologetically polycarbonate."

Apple marketing head Phil Schiller called the 5C's higher-end brother, the iPhone 5S, the "most forward thinking phone ever," designed to run both 32-bit and 64-bit apps and including an upgraded camera along with a new fingerprint sensor built into the phone's home button that's intended to provide convenient security. Several analysts embraced Apple's upgraded 5S.

[Photo caption: New models of the Apple iPhone 5C on display in the Apple Store in Berlin, Germany, 10 September 2013. The introduction of the new Apple smartphones was held in Cupertino, California, USA, and screened live in the store in Berlin. (via Bay Area News Group)]

"You can't underestimate how important security has become for consumers," said Tim Bajarin, president of Creative Strategies. "The camera clearly delivers a new set of features, larger pixels, a wide space for images and all these filters. It's just absolutely stunning. It'll make the iPhone 5S one of the best smart phone cameras available."

Investors and advertisers also may be impressed by the new phone's 64-bit upgrade, which Bajarin called the "kind of new processing power that will allow software developers to create even more interesting and powerful applications, not just games. It'll provide a more intense experience and increase the speed of video and the quality."
[Graphic caption: Apple unveils two new iPhones, the less expensive iPhone 5C and a high-end upgrade, the iPhone 5S. The 5C will come in new colors; the 5S debuts a fingerprint scanner in the home button, a faster CPU and major camera and video upgrades; a comparison. (TOBEY/The Washington Post)]

The iPhone 5S will cost $199 for a 16GB model, $299 for a 32GB version and $399 for a 64GB model -- all on two-year contracts. An "unlocked and contract-free" version carried by T-Mobile will be available for $549 for the 16GB version and $649 for a 32GB model. The iPhone 5S will come in silver, gold and "space gray." Pre-orders for the iPhone 5C and 5S will begin on Friday. They will be available for sale on Sept. 20. Apple also will keep its 8GB iPhone 4S, which will be available for free on a two-year contract.

It's unclear how the public will react to the announcements. Some analysts said Tuesday's presentation offered no surprises following weeks of leaked media reports, and Apple stock fell $11.53, or 2.28 percent, Tuesday to close at $494.64. Shares dipped slightly in after-hours trading. "There were no surprises at all," said Bob O'Donnell, an analyst at technology research firm IDC. "Some people are going to be disappointed."

The iPhone 5C is not a "cheap" version of the iPhone, noted Avi Greengart, an analyst with market research firm Current Analysis. "It's an iPhone 5, just made out of different material," Greengart said.

Apple's announcements came as the company arguably needs another hit product. As a company, Apple's sales growth has slowed to a crawl and its profits have slumped. Meanwhile, its stock price, despite recovering recently, is still down more than 30 percent from the highs it set last year. While Apple's iPhone sales have held up better than its tablet and computer sales, they still have been hit by the slowdown in the company's business. And thanks to that slowing growth, Apple's market share in smartphones has slumped.
In the second quarter, Apple held about 14 percent of the worldwide smartphone market, compared with about 19 percent a year earlier, according to Gartner.

One of the attention-grabbing aspects of the iPhone 5S is its new level of security aimed at preventing anyone else from accessing the phone. Apple's fingerprint recognition "Touch ID" sensor is designed to scan through the sub-epidermal layer of skin. Fingerprint information will be encrypted and stored inside the A7 chip and will not be backed up to the iCloud or to Apple's servers, according to an Apple video. The Touch ID technology also can be used to make purchases at any of Apple's iPhone stores -- to buy books, music, movies and apps -- without entering a password.

Forrester analyst Frank Gillett called the new fingerprint security system "jaw droppingly easy" and "the first painless biometric I've seen."

Tony Cripps, principal device analyst at Ovum, said, "Apple is certainly offering meaningful innovation here. Moving to a 64-bit architecture means Apple can genuinely claim to have brought something new to the smartphone party. It should certainly help the company further cement its lead as a mobile gaming platform and will give the Android fraternity something to think about in a space whose significance is sometimes downplayed beyond the gaming world."

Apple executives began their presentation by announcing that the iOS 7 operating system will be available for download on Sept. 18 for iPhone 4 models and above and for iPad 2 models and above.
Technology
2016-40/3982/en_head.json.gz/2609
Yes, Online Privacy Really Is Possible
By Eva Galperin and Jillian C. York | Slate Future Tense, Feb. 14 2014 6:19 PM

[Photo caption: You can protect yourself online. Photo by PATRIK STOLLARZ/AFP/Getty Images]

A few short weeks ago, we were conducting a security training for a group of journalists in Palestine. The journalists were deeply aware of the potential threats facing them, from not one but three governments, but didn't have the first idea of how to mitigate those threats. "It's too confusing!" claimed one, while another said it was futile. Unfortunately, these reactions are all too typical; we've heard the same from a variety of populations all over the world. Despite all of the awareness-raising around surveillance that has taken place over the last year, many individuals feel disempowered, helpless to fight back.

Efforts such as the February 11 initiative the Day We Fight Back aim to empower individuals to lobby their representatives for better regulation of mass surveillance. But legislation and policy are only part of the solution. In order to successfully protect our privacy, we must take an approach that looks at the whole picture: our behavior, the potential risks we face in disclosing data, and the person or entity posing those risks, whether a government or company. And in order to successfully fight off the feeling of futility, we must understand the threats we face.

In a recent piece for Slate, Cyrus Nemati hems and haws over the complexities of creating a private online existence, ultimately choosing to give up on Internet privacy and embrace the convenience of sharing.
While working at an organization that advocates for digital rights, Nemati found himself anxious about his personal privacy and took steps that made browsing "a chore"; later, after getting married and wanting access to social tools, he claims he "learned … to love a less private Internet." The truth is that most of us simply can't protect ourselves from every threat 100 percent of the time, and trying to do so is a recipe for existential dread. But once we understand our threat model, what we want to keep private and whom we want to protect it from, we can start to make decisions about how we live our lives online. You'll find yourself empowered, not depressed.

Threat modeling is an approach undertaken by the security community. It looks at the specific circumstances of the individual and the potential threats facing him or her and makes a diagnosis (and a prescription) on that basis. Threat modeling looks at what a person has to protect (her assets), who she has to protect those assets from (her threat), the likelihood that she will need to protect them, her willingness to do so, and the potential consequences of not taking precautions.

A teacher in suburban California doesn't have the same set of online privacy concerns as a journalist in Palestine. And the kinds of steps the teacher might take to protect his personal photos from nosey students and their parents are quite different from the precautions the journalist might take to protect her anonymous sources from being identified by the government. Some of us don't want our Internet browsing habits tracked by companies like Google and Facebook. Some of us don't want the NSA reading our emails. But without enumerating our threats and our assets, it's easy to choose tools that are inappropriate or unnecessary to the task at hand. The schoolteacher probably doesn't need to PGP-encrypt his email or run every privacy-enhancing app and plugin, like Nemati did in his privacy hipster phase.
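The threat-modeling questions the authors enumerate (assets, adversary, likelihood, consequences, and a proportionate response) can be written down as a simple checklist record. A hypothetical sketch, with the teacher and journalist examples from the text; the field names and entries are illustrative, not a standard security taxonomy:

```python
# A minimal threat-model record mirroring the questions in the text.
# Field names and example values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Threat:
    asset: str        # what you want to protect
    adversary: str    # who you are protecting it from
    likelihood: str   # how likely you are to need protection
    consequence: str  # what happens if you take no precautions
    mitigation: str   # a step proportionate to THIS threat, not every threat

teacher = Threat("personal photos", "nosey students and parents",
                 "medium", "embarrassment at school",
                 "tighten social-media privacy settings")

journalist = Threat("anonymous sources", "government surveillance",
                    "high", "sources identified and endangered",
                    "PGP-encrypted email")
```

The point of writing it out this way is the article's own: the right `mitigation` falls out of the `asset`/`adversary` pair, so the teacher never ends up running the journalist's toolchain.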
The journalist might find that taking the time to use PGP gives her peace of mind. Nemati's frustration may not have come from failing to list his threats and assets as much as it may have come from misidentifying them. He writes that he "treat[ed] himself like a criminal, obsessed with keeping a very low online profile," a perfect recipe for frustration, bearing little to no resemblance to how an actual criminal might behave. A successful criminal understands his threat (law enforcement) and recognizes the steps he needs to take to evade it, which may or may not include keeping a low profile online. Nemati might instead face the threat of his parent, spouse, or boss viewing his online activity and work to hide those activities from them. He might also be worried about criminals who want to steal his login credentials and gain access to his bank account. This requires an understanding of security settings, social media, and browser privacy settings, for sure, but not the elaborate privacy kabuki Nemati describes.

Don't get us wrong: We're sympathetic to Nemati and the many Internet users like him we meet every day. But we also know that the choice between a crippled Internet experience and an Internet in which privacy is a mere afterthought is a false one. Instead of heading down the rabbit hole of deep paranoia and subsequent nihilism, we recommend that you tackle the task of becoming safer online the way you would any other task: step by step. By starting slow and building on your repertoire of tools, you can protect yourself. For a list of 10 things you can do right now to protect yourself against surveillance, check out this blog post from the EFF, where we work. Total privacy on the Internet may not be possible, but meaningful privacy is within your reach. And you don't have to go crazy trying to achieve it.

Future Tense is a partnership of Slate, New America, and Arizona State University.
Eva Galperin is a global policy analyst for the Electronic Frontier Foundation. Her work focuses on providing digital privacy and security for vulnerable populations. Jillian C. York is the director for international freedom of expression at the Electronic Frontier Foundation.
Technology
2016-40/3982/en_head.json.gz/2610
The Worst Case Scenario Has Come True: California's Snowpack Is Now Zero Percent of Normal
By Eric Holthaus | Slate, The Slatest, May 29 2015 2:56 PM

[Photo caption: A stump sits at the site of a manual snow survey on April 1, 2015 in Phillips, California. The current recorded level is zero, the lowest in recorded history for California. Photo by Max Whittaker/Getty Images]

California's current megadrought hit a shocking new low this week: On Thursday, the state's snowpack officially ran out. At least some measurable snowpack in the Sierra mountains usually lasts all summer. But this year, its early demise means that runoff from the mountains, which usually makes up the bulk of surface water for farms and cities during the long summer dry season, will be essentially nonexistent. To be clear: There's still a bit of snow left, and some water will be released from reservoirs (which are themselves dangerously low), but this is essentially a worst-case scenario when it comes to California's fragile water supply. This week's automated survey found California's statewide snowpack had officially run out. (California Department of Water Resources)

The state knew this was coming and has been working to help soften the blow, but it's fighting a losing battle. Bottom line: 2014 was the state's hottest year in history, and 2015 is on pace to break that record. It's been too warm for snow. Back in April, Gov. Jerry Brown enacted the state's first-ever mandatory water restrictions for urban areas, based mostly on the abysmal snowpack. In recent days, the state's conservation efforts have turned to farmers, who use about 80 percent of California's water. With a burgeoning El Niño on the way, there's reason to believe the rains could return soon, but not before October or November. The state's now mired in such a deep water deficit that even a Texas-sized flood may not totally eliminate the drought.
Welcome to climate change, everyone.

Eric Holthaus is a meteorologist who writes about weather and climate for Slate's Future Tense. Follow him on Twitter.
Technology
2016-40/3982/en_head.json.gz/2684
Introducing nanotechnology to industries
By Ananda KANNANGARA

Science and Technology Minister Prof. Tissa Vitarana stressed the importance of introducing nanotechnology to industries and said fund controllers in the country must extend their fullest co-operation to develop the technology. He was speaking at a seminar on 'Tapping the World of Nanotechnology', organised by the National Science Foundation and the Small and Medium Enterprise Developers of the Federation of Chambers of Commerce and Industry in Sri Lanka (FCCISL).

Nanotechnology is a branch of engineering that deals with the design and manufacture of small electronic circuits and mechanical devices that are built at the molecular level of matter.

The Minister also said that there was a major economic crisis in the world and Sri Lanka could take advantage of this situation by using nanotechnology. The Minister thanked the government for releasing a 60-acre land at Homagama to set up a Nanoscience Park with facilities for research and development in nanotechnology. He also said that the Sri Lanka Nanotechnology Institute at Biyagama is a public-private partnership and would help to conduct research activities and apply nanotechnology for the advancement of technologies.

Dr. Rohan Munasinghe of the Moratuwa University Engineering Faculty said that governments and industries are investing in research and development in nanotechnology, since it is an interdisciplinary field that encompasses physics, chemistry, biology and engineering. Prof. Veranga Karunaratne of the Sri Lanka Institute of Technology said that nanotechnology is at an infant stage in the country and scientists could contribute to developing technologies.
Technology
2016-40/3982/en_head.json.gz/2896
Home / Science News
Manta rays threatened by fishermen
Nov. 24, 2012 at 2:53 PM

RAJA AMPAT, Indonesia, Nov. 24 (UPI) -- Marine scientists say they are working to save a population of manta rays off the coast of Indonesia. Manta rays, abundant around Raja Ampat, eastern Indonesia, were listed as "threatened" under the International Convention on the Conservation of Migratory Species in 2011, NBC News reported.

Scientists say mantas are being caught as bycatch in industrial fishing nets targeting different types of tuna and, increasingly, for their gill rakers, which allow them to filter food from water and are used in traditional Chinese medicine. A report called Manta Ray of Hope found an estimated 3,400 manta rays and 94,000 mobulas, which are related to the manta ray, are caught each year, but the numbers reflect only reported catches. "Unreported and subsistence fisheries will mean true landings are much higher," the report said.

Scientists in China are working to have manta rays protected by the government. "In the last two years, we have conducted evaluations of the manta ray and submitted a recommendation to the government to list it as a protected species," said Professor Wang Yanmin from the Chinese Shandong University's Marine College. Feng Yongfeng, founder of Green Beagle, a group that promotes environmental protection, said, "There is no regulation for protecting the manta ray so sales of mantas are not illegal."

"They're such an iconic species, beloved by divers," said Andrea Marshall, director of the Marine Megafauna Foundation. "They're just amazing."
Technology
2016-40/3982/en_head.json.gz/2897
Home / Science News / Technology
Stephen Hawking: Dismissing artificial intelligence would be a mistake
Scientists say not enough research is being done on the effects of artificial intelligence.
By Danielle Haynes | May 3, 2014 at 2:40 PM

LONDON, May 3 (UPI) -- Stephen Hawking, in an article inspired by the new Johnny Depp flick Transcendence, said it would be the "worst mistake in history" to dismiss the threat of artificial intelligence. In a paper co-written with University of California, Berkeley computer-science professor Stuart Russell and Massachusetts Institute of Technology physics professors Max Tegmark and Frank Wilczek, Hawking cited several achievements in the field of artificial intelligence, including self-driving cars, Siri and the computer that won Jeopardy!

"Such achievements will probably pale against what the coming decades will bring," the article in Britain's Independent said. "Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks."

The professors wrote that in the future there may be nothing to prevent machines with superhuman intelligence from self-improving, triggering a so-called "singularity."

"One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all," the article said. "Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues outside non-profit institutes such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute.
All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks."
Technology
2016-40/3982/en_head.json.gz/2924
Scientists Solve Mystery of Brilliant Northern, Southern Lights
October 27, 2009 12:12 PM
Scientists have solved the mystery behind the brilliant northern and southern light show known as Aurora Borealis. The phenomenon is caused by electromagnetic energy from the sun, which experts say also wreaks havoc on ground-based power grids and satellites. VOA's Jessica Berman reports.
Just like atmospheric conditions can affect weather on the ground, experts say the Sun is responsible for weather in outer space. They say the Sun's atmosphere emits high-energy solar winds that bathe the Earth continuously with electromagnetic energy. Nicola Fox is with the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland. "You could really think of it as us living in the atmosphere of the Sun," Fox explained. "So, if the Sun changes, the Earth will feel its effects. So, if the Sun sneezes, the Earth will catch a cold."
But until now, space scientists have been unable to pinpoint the source of the energy releases in the Earth's atmosphere that are responsible for the spectacular light show, aurora borealis, in the extreme northern and southern latitudes. The same energy releases are responsible for dangerous sub-storms that disrupt ground-based power grids and communications systems. Scientists at the University of California at Los Angeles discovered the source of the space blasts using five satellites of the U.S. space agency's THEMIS program.
The researchers explain that the Sun's and Earth's electromagnetic fields normally glide past one another in many different directions. But when enough energy builds between the two fields, they snap and right themselves in a process scientists call reconnection. David Sibeck is the THEMIS project scientist with the U.S. space agency NASA. He says reconnection releases a huge amount of electrical current into the magnetosphere that surrounds the planet. "When reconnection occurs, that current is broken and it flows down to the Earth, so you have like a short circuit out in the Earth's magnetic field," explained Sibeck. "And it's that current that's going to power the aurora and dump into the Earth's ionosphere and cause power line disruption in Canada, for example, by blowing out transformers." Scientists say it is important to know about sub-storms in order to take measures to protect valuable technical equipment, and possibly the lives of spacewalking astronauts. The discovery of the mechanism behind sub-storms is reported in the journal Science.
Watch NASA video on Auroras and THEMIS satellites
Technology
2016-40/3982/en_head.json.gz/3073
NASA Administrator Charlie Bolden's Blog
NASA's New Neighbor on the National Mall: Reflections on the Opening of the Smithsonian's National Museum of African American History and Culture
Posted on September 23, 2016 at 5:09 pm by Stephen Fox.
Tomorrow, the National Museum of African American History and Culture opens its doors to the public. Located on the National Mall, the museum is less than a mile away from NASA's Washington Headquarters. Recently I spoke to NASA TV about the significance of this occasion, and you can watch a few clips from our conversation below. On a personal note, I think it's critically important – and really impressive – that at long last we're going to have a museum on the Mall that's dedicated to people of African descent here in the United States. Growing up in Columbia, South Carolina during segregation, I never in my wildest dreams would have believed that I would experience the opening of a museum dedicated to African American history and culture – let alone during the Administration of America's first Black president, whom I have the privilege of serving under as NASA's first African American Administrator. Because I believe so strongly in its mission, I donated a few personal items to the museum, including my flight suit and mission patch from STS-60, my final space flight and a critical mission to what would become the International Space Station program; a model of the Hubble Space Telescope – what I believe to be the most incredible scientific instrument that humanity has created; and rugby shirts from STS-45 and STS-31, among other items. It's my hope that young people who visit the museum will be encouraged to reach for new heights in their own lives, and use their dreams as inspiration to work hard, study hard, and refuse to be deterred by failure. The reason I applied to the astronaut program many years ago was that the late, great Dr.
Ron McNair – himself a hallmark figure in both African American history and the history of America's space program – encouraged me to go for it. It is my hope that the objects and displays in this museum will have the same sort of impact on a new generation of future astronauts, artists, engineers, educators, physicists, philosophers, physicians and so forth. Being the first African American Administrator is all well and good, but I want to make sure I'm not the last. Encouraging more young people from underserved communities to study the STEM subjects of science, technology, engineering and math is one important way to make sure of this. Reminding the next generation that even the sky is not the limit is another. Best wishes to everyone involved in opening this important new museum. I cannot wait to visit! NASA Administrator Charles Bolden discusses the flight jacket from mission STS-60 that he donated to the new National Museum of African American History and Culture. NASA Administrator Charles Bolden talks about the advice he gives to young people, on the occasion of the opening of the National Museum of African American History and Culture. NASA Administrator Charles Bolden discusses the historic significance of the new National Museum of African American History and Culture. This entry was posted in Uncategorized on September 23, 2016 by Stephen Fox.
NASA Uses Federal Employee Viewpoint Survey to Improve Performance
Posted on September 20, 2016 at 8:11 am by Administrator Charles Bolden.
NASA is proud to have been named the "Best Place to Work" in the Federal Government (among large agencies) for the past four consecutive years by the Partnership for Public Service. Using the Federal Employee Viewpoint Survey (FEVS) as a focal point for guidance, over time we have developed a positive work culture with a high level of employee engagement through deliberate, proactive initiatives. I've always told our employees that their voices matter.
At NASA, it’s especially critical, as much of our work is difficult and dangerous, and sometimes lives are in the balance. We must have a culture where speaking up and providing feedback is encouraged. I’ve made nurturing that culture a centerpiece of my leadership, and we created a Workforce Culture Strategy to communicate and codify these values. With some 18,000 employees at NASA, getting feedback can be daunting, and the FEVS helps provide a vehicle where people feel they can be candid and offer constructive comments without putting themselves or their jobs at risk. We use it to help offices within our organization to improve and to share their successes. At NASA, we consider ourselves a family and, like any family, there can be some bumps in the road. The FEVS helps us get past them. Based on last year’s employee feedback, we focused this year on second-level performance reviews to support and encourage fairness in ratings, and we created a Leader’s Handbook to guide supervisors and employees, and to foster organizational health. I’m still listening – and feel privileged to be working with such a talented, creative workforce. The best part of serving as NASA Administrator continues to be witnessing how open and honest opinions and ideas have changed NASA for the better. Our entire NASA senior leadership team sincerely cares about our workforce’s opinions and is ready to take action. I want to thank my colleagues and their teams for using the FEVS to make progress on employee engagement. I know agencies across government are using this important tool to make similar strides. All of us need to work each and every day to make sure the talented people who work for the Federal Government feel valued, included, and engaged in their jobs. This entry was posted in Uncategorized on September 20, 2016 by Administrator Charles Bolden. Kibo: Our Shared Destiny Will Be Written by Us, Not for Us Posted on August 4, 2016 at 11:00 am by Administrator Charles Bolden. 
This week, I embarked on a visit to Japan for discussions with a variety of senior Japanese government officials about our mutual interest in space exploration. I will also visit NASA’s outstanding partners at JAXA, the Japan Aerospace Exploration Agency. With more than 30 active agreements in place, NASA and JAXA have one of the strongest, most comprehensive and longest lasting space bilateral relationships of any two nations in the world. One of the greatest illustrations of this partnership is the International Space Station (ISS), orbiting 250 miles (400 kilometers) above the Earth at 17,500 miles per hour (about 28,000 kph) with six astronauts on board as I write this! NASA’s Journey to Mars is taking shape aboard the orbiting laboratory, where astronauts from different countries are working together to advance research and technology that will allow future astronauts to travel deeper into space, at the very same time we create jobs and improve our quality of life here on Earth. Japan and the United States are working together aboard the Space Station with many other international partners – and we will be for the foreseeable future. Today, Japanese astronaut Takuya Onishi and American astronauts Jeff Williams and Kate Rubins are living and working together with their Russian crewmates at the cutting edge of innovation, science and discovery. Their research ‘Off the Earth, For the Earth’ promises to deepen understanding and expand human progress around such areas as medicine, biology, technology, Earth science, material production and communications – and that’s just the short list! Because leaders in both the U.S. and Japan have chosen to extend our Space Station participation through at least 2024, the promise and potential progress that comes out of this research will continue for years to come. 
In the more immediate future, the research benefitting all of humanity will be bolstered by cargo delivered to station aboard Japan's upcoming HTV-6 mission (which, as was announced recently, is set to launch in October of this year). As we consider the bright future of our partnership, I'm very much looking forward to joining our friends at JAXA this week for a ceremony to officially open a control room for Kibo, the Japanese Experiment Module on ISS. Kibo is, appropriately, a Japanese word meaning "hope," and I believe that "hope" is an excellent description of the research that's being conducted aboard the International Space Station and the cooperation that goes into it. President Obama once said that "hope is the belief that destiny will not be written for us, but by us, by the men and women who are not content to settle for the world as it is, who have the courage to remake the world as it should be." The International Space Station is the embodiment of this sort of hope and effort. Consider this: more than 220 human beings from 18 countries have visited the International Space Station; tens of thousands of people have been involved in its construction and operation; and people from dozens of countries have had their research and experiments flown aboard it. As we look forward to an exciting future exploring space, I am also enthusiastic about advances the U.S. is making in air travel a little closer to Earth. We are in the midst of an incredible moment in the history of aeronautics. With President Obama proposing an historic investment in green aviation, we have an opportunity to make air travel cleaner, greener, safer and quieter – even as our skies grow more crowded and aircraft fly faster. One of the more important areas of NASA aeronautics research is air traffic management.
Our country’s skies will have to absorb an estimated four billion more passengers over the next several decades and it’s essential that we do this without compromising the safety of our skies. We in the United States are not the only country with an interest in building a more efficient air traffic management system. International commerce depends on air transportation and it is imperative that we work together with partner countries around the world to maximize human resources and investment for the benefit of all humanity. With this in mind, after my visit to Japan I plan to travel to China to discuss areas of mutual interest in aviation research between NASA and the Chinese Aeronautical Establishment (CAE). This will be part of ongoing conversations that began in November of 2014 and have continued through a NASA-CAE workshop in Beijing that was held in August 2015. Taken together, our partnerships around the world continue to instill optimism – and inspire hope – about the future of space exploration, aeronautics and our ability to write our own destiny – together. This entry was posted in Uncategorized on August 4, 2016 by Administrator Charles Bolden. Bringing Humans to Mars and Humanity Together Posted on June 3, 2016 at 3:10 pm by Administrator Charles Bolden. NASA’s Journey to Mars is about more than sending American astronauts to the Red Planet in the 2030s; it’s about bringing people together here on Earth. It’s about strengthening the American economy and with it the economic security of families throughout our country. It’s also about strengthening our friendships across sectors and also across national borders. This is why I’m fond of reminding virtually every audience to whom I speak that sending humans to Mars requires all hands on deck – government, industry, academic and international partners and citizen scientists – we need everybody. 
Today, I’m embarking on a journey of my own — to meet with our global friends in international space agencies, governments, private companies, universities and other forums; folks who are eager to be part of NASA’s Journey to Mars. I plan to carry with me a message of partnership as I remind them of how much the American people value their friendship, especially when it comes to space – which in many ways is the great global connector. It should not be lost on any of us that for the last decade and a half, human beings from multiple countries have been living and working together on the International Space Station (ISS) in common pursuit of human progress. It certainly is not lost on me that a girl or boy age 15 or younger has lived every single second of every day of her or his life while human beings have been living and working together in space. Our grandchildren’s children may very well live every day of their own lives while human beings are living and working together on Mars. For this reason, I’m a firm believer in the soft power that our country is able to demonstrate when we engage in space diplomacy. From our perspective at NASA, one of the most gratifying developments over the past few years has been the increasing number of nations who have joined the global exploration endeavor. Nations large and small, both with and without formal space agencies, have all come to the conclusion that everyone who has a passion for space can find a role and a place where their expertise is critical. In short, every single nation can play a part in our journey to Mars, in our scientific journey of discovery and in the next phase of humanity’s development as a spacefaring people. Over the course of this trip, I will have the opportunity to discuss NASA’s Journey to Mars with the Israeli Minister of Science, Technology and Space, the Israel Space Agency (ISA), and Israeli innovators, students and entrepreneurs.
I’ll also be meeting with students in both Israel and Jordan who participate in the Global Learning and Observations to Benefit the Environment (GLOBE) science and education initiative, of which NASA is a proud partner. I’ll also be traveling to the United Arab Emirates (UAE) to meet with colleagues at the UAE Space Agency. I’ll wrap up this trip with a meeting with NASA partners in the European Space Agency (ESA) at the ESA Council in Paris. We recognize that NASA provides inspiration to dreamers and doers of all professions everywhere around the world, so we are looking forward to partnering with the U.S. Embassy in Amman and His Royal Highness Crown Prince Al Hussein bin Abdullah II to host a public dialog about NASA’s Journey to Mars while I am in Jordan. Everywhere I travel, I meet people who are looking to the United States for leadership when it comes to space exploration. Time and again I hear enthusiasm about our Journey to Mars and an appetite for partnership in this remarkable pursuit of progress and possibility. Together, we can bring humanity to the face of Mars and reach new heights for the benefit of all humankind … and we will. This entry was posted in Uncategorized on June 3, 2016 by Administrator Charles Bolden. NASA, NOAA Analyses Reveal Record-Shattering Global Warm Temperatures in 2015 Posted on January 20, 2016 at 4:33 pm by Administrator Charles Bolden. By NASA Administrator Charles Bolden and NOAA Administrator Kathryn Sullivan Climate change is one of the most pressing issues of our generation and it affects every person on Earth. Tracking the changes in the global climate is the basis for understanding its magnitude and extent. Today’s announcement that NASA and NOAA scientists have determined that 2015 was the hottest year in recorded history underscores how critical Earth observation is. 
The NOAA-NASA collaboration has served the country very well, from the origin of space-based remote sensing for weather forecasting to the Earth system monitoring and science that are so crucial to tackling the issues of our times. This announcement is a key data point that should make policy makers stand up and take notice — now is the time to act on climate. The modern temperature record dates back to 1880, and 2015 was the warmest year by a long shot. There has been a lot of talk about the strengthening El Niño in the Pacific Ocean and how that might be supercharging temperatures. El Niño did likely play an important role – but more significantly, 2015’s record temperatures are the result of the gradual, yet accelerating, build-up of carbon dioxide and other greenhouse gases in Earth’s atmosphere. Scientists have been warning about it for decades and now we are experiencing it. This is the second year in a row of record temperatures and what is so interesting is that the warmest temperatures often occur the year after an El Niño, like in 1998 compared to 1997. Fifteen of the 16 warmest years on record have now occurred since 2001. Temperatures will bounce around from year to year, but the direction of the long-term trend is as clear as a rocket headed for space: it is going up. This record-breaking moment is a good time to take stock of what we know of our changing planet and why it is important for NASA, NOAA and other federal agencies to continue studying Earth’s climate and how it is changing: Sea levels are rising – nearly three inches in the past two decades alone. The successful launch earlier this week of the NOAA-led Jason-3 mission will continue our 23-year record of measuring sea level change from space with remarkable precision.
In the coming years and decades, our work to understand how quickly seas are rising will be vital to coastal cities in the U.S., millions of people around the world who live in low-lying areas, and to NASA’s own facilities at Kennedy Space Center, where we will one day launch astronauts to Mars, and other affected facilities such as the Stennis Space Center, Wallops Flight Facility and Michoud Assembly Facility. The Arctic ice cap is shrinking. In the 1970s and 80s, NASA scientists pioneered techniques to measure the extent of sea ice at the poles. That new ability quickly gave way to the realization that the Arctic ice cover – which plays a significant role in the planet’s climate and even the weather we experience in the U.S. – is retreating and growing thinner. NOAA’s global drifting buoy program and other NOAA and international ocean temperature and land surface temperature measurements have provided the means to measure the temperature at the Earth’s surface, so critical to our survival. Ice sheets and glaciers worldwide are shedding ice. Greenland is losing about 300 billion tons of ice per year, according to measurements from NASA’s GRACE mission. Observations from the agency’s Operation IceBridge have helped confirm rapidly accelerating changes in the West Antarctic Ice Sheet and the dramatic retreat of glaciers in Alaska. Given the pace of these changes and their significance for the climate and sea level rise, we need close and continuous monitoring. In 2017, NASA will launch two missions – GRACE-FO and ICESat-2 – that represent a major refresh of our capabilities to observe how ice sheets and glaciers are changing. Rising temperature is not an isolated effect of rising greenhouse gas levels, and scientists are still studying the full implications of a warmer world. How might patterns of drought and precipitation change? Will ecosystems and species be able to adapt to human-induced climate change? 
What might these changes mean for wildfires, agriculture and the economy? Climate change isn’t a problem for the future. Earth’s climate is changing now. At NASA, we use our unique vantage point from space to study the planet as a whole system. NOAA’s scientists are on the ocean, land and in the sky collecting data that help bring clarity. Our job is to answer these kinds of questions, to make the measurements needed to get to those answers and to provide our knowledge and our data freely so the world can address this fundamental challenge. This entry was posted in Uncategorized on January 20, 2016 by Administrator Charles Bolden.
Building a Robust Commercial Market in Low Earth Orbit
Posted on January 14, 2016 at 2:58 pm by Administrator Charles Bolden.
NASA is on a Journey to Mars and a new consensus is emerging around our plan, vision and timetable for sending American astronauts to the Red Planet in the 2030s. Our strategy calls for working with commercial partners to get our astronauts and cargo to the International Space Station while NASA also focuses – simultaneously – on getting our astronauts to deep space. Few would have imagined back in 2010 when President Barack Obama pledged that NASA would work “with a growing array of private companies competing to make getting to space easier and more affordable,” that less than six years later we’d be able to say commercial carriers have transported 35,000 pounds of space cargo (and counting!) to the International Space Station (ISS) – or that we’d be so firmly on track to return launches of American astronauts to the ISS from American soil on American commercial carriers. But that is exactly what is happening.
Since the first SpaceX Dragon commercial resupply mission to deliver cargo to the ISS in October 2012 and Orbital ATK’s first Cygnus mission in January 2014, American companies have delivered cargo to the Space Station that enables our astronauts to work off Earth for Earth on extensive and ongoing scientific research and technology demonstrations aboard the Space Station. This has included investigations that directly benefit life on Earth and expand commercial access to microgravity research through the U.S. National Laboratory (which is operated by the Center for the Advancement of Science in Space or CASIS). All this matters because NASA research helps us understand our home planet as well as the solar system and beyond, while technology demonstrations and human health research like astronaut Scott Kelly’s one-year mission and the Twins Study aboard the Space Station prepare us for long-duration missions into deep space. As a result, we are closer than ever before to sending American astronauts to Mars and at the very same time, we’re “insourcing” American jobs and empowering American entrepreneurs and innovators to expand the nascent commercial market in low-Earth orbit. Today, thanks to the bold plan laid out by the President, Americans are working at more than 1,000 companies in nearly every state in the Union on NASA commercial space initiatives. Across the board, about 80% of NASA’s activities are carried out by our partners in industry and at America’s academic institutions. We develop more than 1,600 new technologies a year and work with business partners to transfer thousands of products, services and processes into the market for job creation and economic growth. More venture capital was invested in America’s space industry in 2015 than in all the previous 15 years combined. 
In other words, at NASA we’re exploring deep space, but we’re anchored right here on Earth, where we’re creating jobs and fueling innovation, technology development and growth, recognizing that it all depends on American ingenuity and innovation. With the recent passage of the FY2016 federal budget and our selection of Robert Behnken, Sunita Williams, Eric Boe and Douglas Hurley to be the first NASA astronauts to train to fly to space on commercial crew vehicles, we are close to returning human launches to American soil and ending our sole reliance on the Russians to get into space. In addition, the commercial crew spacecraft will enable us to add a seventh crew member to the normal Space Station crew complement, effectively doubling the amount of crew time available to conduct research off Earth for Earth. The additional research (and crew supplies) will be delivered during cargo resupply missions. A NEW MILESTONE Despite critics who may have said this was a pipe dream just five short years ago, we continue to transform the way NASA does business and as a result, today we’re able to mark another significant milestone that will carry President Obama’s vision further into the future. This afternoon, our ISS team in Houston will announce that NASA is making its new award for commercial space cargo delivery to the ISS. This is a big deal, because our commercial resupply missions enable NASA and our private industry and other government agency partners to continue the extensive, ongoing scientific research aboard the Space Station. President Obama extended the life of the International Space Station through at least 2024 (with the support of Congress) and our commercial cargo providers ensure cargo resupply missions continue, enabling us to keep using the station as our springboard to the rest of the solar system and a test bed for human health in space. Today’s selection builds on our initial resupply partnerships. 
It will ensure that NASA maintains the capability and flexibility to operate the ISS and conduct the vital research of a unique National Lab through resupply services launching from the United States. As President Obama said, “in fulfilling this task, we will not only extend humanity’s reach in space — we will strengthen America’s leadership here on Earth.” Our investment in commercial space is creating jobs and it’s bringing us closer to sending American astronauts to Mars. Competition, innovation and technology – it’s the American way. It’s helping us to Launch America. This entry was posted in Uncategorized on January 14, 2016 by Administrator Charles Bolden. NASA’s Work to Understand Climate: A Global Perspective Posted on December 4, 2015 at 3:40 pm by Administrator Charles Bolden. NASA is uniquely positioned to study our home planet, and Earth observation has been at the core of the agency’s work since our founding. In addition to a fleet of amazing satellites that we and our international partners use to study our planet in a range of wavelengths, and across the spectrum of planetary features from oceans to atmosphere and ground cover, the International Space Station is also rapidly becoming a significant platform to study Earth. Our work has global implications. This week, a small delegation of NASA leaders have been participating with a larger U.S. delegation at the 21st session of the U.N. Framework Convention on Climate Change (UNFCCC) Conference of Parties, also known as COP-21. COP-21 will bring nearly 200 nations together to reach an agreement on limiting climate change. Global climate change, driven by rising levels of carbon dioxide and other greenhouse gases in the atmosphere, represents a fundamental challenge to the U.S. and the world. It is the challenge of our generation. 
While NASA has no formal role in the COP-21 climate policy talks, the agency is hard at work providing the nation and the world the best information possible about how Earth is changing. Regardless of what world leaders decide in Paris, our job is to build an understanding of the whole planet now and what it will look like in the future. NASA’s comprehensive study of Earth has provided much of the underlying understanding of current trends in the planet’s climate – including definitive measurements of rising sea levels, glacier retreat, ice sheet changes and the decline in the volume of the Arctic sea ice cap. Our satellites have provided global, long-term views of plant life on land and in the ocean. And our supercomputing power is allowing us to better understand how all the parts of the Earth system work together and help us to predict how this could change. We will continue to monitor climate trends and investigate other ways in which the planet is ultimately responding to increasing greenhouse gas levels. We have discovered more than a thousand planets outside of our solar system, but none yet match Earth’s complexity. That’s one reason we have more satellites orbiting Earth than any other planet. We made a significant expansion of the Earth-observing fleet in 2014 and 2015, launching missions that are making unprecedented measurements of rainfall and snow (Global Precipitation Measurement), carbon dioxide in the atmosphere (Orbiting Carbon Observatory-2), and soil moisture (Soil Moisture Active Passive). Soon, with the help of NOAA, the French Space Agency CNES, the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) and SpaceX, we will launch the Jason-3 mission to continue building on the vital, two-decade record of how much and where global sea level is changing. The view from space is incredible – seeing our planet from orbit is one of the highlights of my life – but sometimes we need to get in a little closer.
So in 2015 and throughout 2016, NASA is sending scientists on expeditions to all corners of the planet – by plane, by ship and even on foot – to get an on-the-ground look to help answer some important science questions. How are warming ocean waters melting Greenland glaciers and adding to sea level rise? How are the world’s coral reefs responding to changes in the ocean? What will rapidly warming temperatures in the Arctic mean for the greenhouse gases stored in forests and permafrost? Our scientists are putting together multi-year campaigns that will complement our space-based perspective. Consider it planetary exploration right here at home. Global meetings like COP-21 are important for discussion and policymaking, and NASA will continue the day-to-day work of monitoring our Earth observation satellites and making their wealth of data available to people across the globe. There’s no more important planet for us to understand. This entry was posted in Uncategorized on December 4, 2015 by Administrator Charles Bolden.
President Obama Meets With Space Pioneers
Posted on October 21, 2015 at 10:38 am by Administrator Charles Bolden.
Monday, the stars were out at the White House — literally — as more than 100 students joined President Obama, twelve astronauts, scientists, engineers, teachers, and space enthusiasts — along with Americans participating virtually from more than 80 national parks, observatories, schools, museums, and astronomy clubs across our country — for White House Astronomy Night. President Barack Obama greets NASA Commercial Crew astronauts: Robert Behnken, Eric Boe, Douglas Hurley and Sunita Williams, NASA Administrator Charles Bolden, and NASA Deputy Administrator Dava Newman, in the Map Room before White House Astronomy Night on the South Lawn of the White House, Oct. 19, 2015. (Official White House Photo by Pete Souza) Some of the brightest stars of the night weren’t celestial in nature.
Rather, they are four space pioneers: astronauts Robert Behnken, Sunita Williams, Eric Boe, and Douglas Hurley. These distinguished veteran astronauts are blazing a new trail, a trail that will one day land them in the history books. NASA selected these four, who privately met with the President earlier in the evening, to be the first astronauts to train to fly to space on commercial crew carriers. It’s an important step on our Journey to Mars, and for President Obama’s ambitious plan to once again launch U.S. astronauts into space from U.S. soil and to create good-paying American jobs in the process – 350 American companies across 35 states are working toward this goal. For as long as I’ve been Administrator, President Obama has made it very clear that returning the launches of American astronauts to American soil is a top priority. Five years ago, when the President came to the Kennedy Space Center in Cape Canaveral, Florida, to ask NASA to work toward sending American astronauts to Mars in the 2030s, he talked about being inspired as a young boy when his grandfather lifted him on his shoulders so he could cheer on astronauts arriving in Hawaii. His hope – and, really, all our hope – is that a new generation of young Americans will be inspired by people like Bob, Suni, Eric, and Doug to reach for new heights, both in their own lives and in the life of our nation. Today’s young people are a part of what I like to call the “space generation.” Those who are younger than 15 have lived every day of their lives in a time when American astronauts are living and working in space aboard the International Space Station. Our goal is to give them a future where Americans are pushing further into the solar system at the very same time that our Nation strengthens our leadership here at home. President Obama’s commercial crew vision represents a giant leap into this future. #AskNASA Chat with NASA commercial crew astronauts. Photos from Astronomy Night 2015. 
Video of the President’s remarks at Astronomy Night. This entry was posted in Uncategorized and tagged Astronomy Night, Bolden, Commercial Crew, Obama on October 21, 2015 by Administrator Charles Bolden. Mars: A Journey We Will Take Together Posted on October 13, 2015 at 8:33 am by Lauren Worley. Nearly everywhere I travel, I meet people who are excited to learn more about NASA’s Journey to Mars and NASA’s plan, timetable and vision for getting there. This past week, we released a detailed outline of our plan – a clear, affordable, sustainable roadmap for sending our astronauts to Mars in the 2030s. It’s called “NASA’s Journey to Mars: Pioneering Next Steps in Space Exploration” and I hope you’ll take a moment to give it a look, here. A Journey such as this is something that no one person, crew, or Agency can undertake alone. As I like to tell the young people with whom I meet, it will take not only astronauts, scientists and engineers, but also physicists, physicians, programmers, poets, teachers, designers, human capital professionals, entrepreneurs and parents who talk to their kids and get them excited about space. It will take folks working both in and out of government. A mission of this magnitude is made stronger with international partnership – the sort of spirit and cooperation that is demonstrated so vividly by the tens of thousands of people across 15 countries who have been involved in the development and operation of the International Space Station. This is the message I plan to share with our friends and partners next week at the International Astronautical Congress (IAC) in Jerusalem. Yesterday I joined the leaders of space agencies from around the world to talk about NASA’s Journey to Mars and the partnerships and cooperation that help make humanity’s common dreams a reality. Tuesday, I’ll join leaders of the Israeli Space Agency to sign a framework agreement to continue ongoing cooperation. 
It extends our decades-long relationship working together in Earth science, discoveries in space and new technologies. The late Israeli astronaut Ilan Ramon – who grew up about 50 miles from where we’ll be meeting – commented that “There is no better place to emphasize the unity of people in the world than flying in space. We are all the same people, we are all human beings, and I believe that most of us, almost all of us, are good people.” Having been blessed with the opportunity to see the Earth from space with my own eyes, I cannot agree more with this sentiment. NASA’s Journey to Mars is ongoing right now — from our Space Launch System rocket and Orion spacecraft to new propulsion and habitation systems – and our partnerships across sectors, across states and across the world make it stronger. This entry was posted in Uncategorized on October 13, 2015 by Lauren Worley. Supporting the People of South Carolina Posted on October 8, 2015 at 10:10 am by Karen Northon. The hearts of the entire NASA family go out to our friends, family, colleagues and countrymen and women in South Carolina. While the people of my home state have seen our share of tough times (including severe weather events), I cannot recall, in all my years growing up in the Palmetto State, rains and flooding as devastating as what has been going on this week. As a child of Columbia, I can personally attest to the fact that South Carolinians are resilient. As the people of the Palmetto State turn to the tough task of recovery and rebuilding, we hope that they will know that NASA is with them every step of the way – and we have been since the storm began. From the time the rain began to fall, our assets in space were watching it and our scientists were harnessing these unique capabilities day after day for weather forecasters and the emergency agencies dealing with the flooding and other impacts of the storm. 
NASA provided regular updates on the amount of rain falling across the region using data from the Global Precipitation Measurement (GPM) mission. Data from the GPM Core Observatory that we launched with the Japan Aerospace Exploration Agency (JAXA) last year is combined with rainfall estimates from a constellation of international satellites to provide rainfall totals every three hours. These data not only confirmed the record-breaking rainfall totals in the Carolinas, they helped forecast the extent of flooding in the region. (Image: Rainfall totals over the U.S. Southeast measured from space by the NASA/JAXA Global Precipitation Measurement mission aided weather forecasters and emergency agencies responding to extensive flooding in South Carolina. Credit: SSAI/NASA/JAXA, Hal Pierce) NASA provided the National Weather Service with detailed information about how water-saturated the ground was across the U.S. Southeast from the heavy rains – a key factor in forecasting flood conditions. Data from GPM and another NASA satellite, the Soil Moisture Active Passive (SMAP) mission, were combined in the NASA Land Information System model to produce experimental soil moisture estimates as a new piece of information for short-term flood forecasting. Maps of the location and severity of local flooding produced by a NASA-funded experimental modeling system at the University of Maryland were provided to the Federal Emergency Management Agency (FEMA) to help identify hard-hit areas across South Carolina. The system, fine-tuned with over a decade of previous NASA satellite precipitation data, used GPM data to estimate the intensity and location of floods every three hours. It will be a while before South Carolina recovers from the enormous rainfall and flooding. The loss of life and property is a heartbreaking outcome of this disaster that will take more than time to heal. 
I want everyone in South Carolina and other parts of the world threatened by natural disasters to know that NASA is dedicated to using our scientific ingenuity and innovative satellite resources to help inform response and recovery efforts on the ground. There are some who have suggested our country and our agency ought to be doing less when it comes to Earth Science. When tragedies like these occur, I believe it’s a reminder that we ought to be doing more. As we make advances in studying Earth’s climate, weather, oceans, ice caps, and land cover, that long-term effort of scientific discovery also yields benefits in improving our ability to respond to and recover from natural disasters. Today, Americans everywhere are thinking about our brothers and sisters in South Carolina. We know that the Palmetto State will recover stronger, just like we always have. This entry was posted in Uncategorized on October 8, 2015 by Karen Northon.
科技
2016-40/3982/en_head.json.gz/3079
UK iOS 7 Usage Rates Higher than North America, Australia Slightly Behind Chitika Insights previously found that iOS 7 users generate the vast majority of Web traffic from North American iPhones (89.7%) and iPads (84.8%). After examining iOS Web traffic within the UK and Australia, the results demonstrate the many similarities and few differences in terms of iOS version distribution between these regions and North America. To quantify this study, Chitika Insights analyzed millions of UK and Australian iOS-based online ad impressions generated within the Chitika Ad Network from May 22 through May 28, 2014, the same time period utilized for a previous study focused on North American iOS users. The results were then divided into iOS version distributions for iPhone and iPad users respectively. As seen above, the UK user base is on par with North America’s in terms of adoption of iOS 7 on iPhones. In both regions, 89.7% of iPhone Web traffic is generated by devices running iOS 7. In Australia, this figure is slightly lower at 86.3%. Looking at iOS 6 usage rates, a higher amount of Australian Web traffic (11.2%) is generated from iPhones running some version of iOS 6 as compared to what is observed in North America and the UK. The slightly greater shares for older iOS versions in Australia may be partially due to the much-publicized issues with Apple Maps in the country following the service’s debut in 2012. While Apple has addressed many of these problems in the following months and years, it’s possible that a small percentage of Australian users are still wary to upgrade to newer OS versions for this reason. When it comes to iOS version distribution for iPad users, the UK adoption rate for iOS 7 (87.0%) is higher than what is exhibited from the North American (84.8%) and Australian (83.3%) user bases. 
Much like the iPhone figures, iOS 6 drives a higher share of iPad Web traffic in Australia as compared to North America and the UK. Notably, iOS 5 or older iOS versions are better represented amongst U.S. and Canadian iPad Web traffic (7.3%) as compared to the UK or Australia, where the combined usage shares for those operating systems are 5.2% and 5.8%, respectively. While Apple has never broken out iPad sales by country, the original iPad, which is not compatible with iOS 6 or 7, was released in the U.S. a full month before it reached the UK or Australia. This likely means a greater number of those units were sold in North America, and are still in use by a comparatively larger portion of the user base considering the longer lifespan of tablets as compared to smartphones. Overall, the high iOS 7 usage rates between all studied geographies and device types point to Apple’s iOS update strategy paying dividends from an adoption standpoint across multiple regions. Additionally, app and mobile Web developers can take some solace in the noticeable similarities in iOS version distribution between North America and the UK – particularly in regards to iPhones. Regarding Australia, its differences from the other two studied regions are slight and may change, but future studies should provide a better indication as to whether these higher rates of older iOS version usage are an ongoing characteristic of the Australian iOS user base.
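The version-share breakdowns reported above can be reproduced from raw impression logs with a simple aggregation: count impressions per (region, device, OS version) and divide by the (region, device) total. A minimal sketch — the log format and sample counts below are hypothetical, not Chitika's actual data:

```python
from collections import Counter

def version_shares(impressions):
    """Percentage of Web traffic per OS version, grouped by (region, device).

    `impressions` is an iterable of (region, device, os_version) tuples,
    one per observed ad impression.
    """
    counts = Counter(impressions)
    totals = Counter((region, device) for region, device, _ in impressions)
    return {key: round(100 * n / totals[key[:2]], 1) for key, n in counts.items()}

# Toy log mirroring the reported UK iPhone split (89.7% on iOS 7):
log = [("UK", "iPhone", "iOS 7")] * 897 + [("UK", "iPhone", "iOS 6")] * 103
shares = version_shares(log)
print(shares[("UK", "iPhone", "iOS 7")])  # -> 89.7
```

The same grouping applied to iPad impressions yields the tablet breakdown discussed below.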
科技
2016-40/3982/en_head.json.gz/3098
8 February, 2001 Taxi! to the programming future By NMA Staff Entranet's TAXI! is the world's first digital TV programme to combine segments of linear entertainment, video-on-demand and interactive shopping. Entranet, a London-based online commerce developer, is showcasing a fully interactive TV programme. The show, TAXI!, is touted to be the first digital TV programme in the world to combine segments of linear entertainment, video-on-demand and interactive shopping. E-commerce partners in the interactive show include financial services company Goldfish, Daimler Chrysler Smartcar, and Warner Bros Cinema. "This style of programming is a natural progression with the advent of technologies like TiVo and VoD, where people can take control of their viewing," says Paul Hastings, head of production at Entranet. "Advertising will become marginalized and more targeted in the future. TAXI! is an example of the way consumers can take control - choose how they want to enjoy a programme, and also choose commerce propositions along the way." The format of the show is an interactive city guide. The viewer is taken on a taxi ride with a Spitting Image-style cabbie, whose mood the viewer can choose, selecting from happy, grumpy or rude dispositions. "This is an example of the way interactivity is built into the show from the beginning, not retrofitted," says Hastings. The viewer then gets to choose which area of the city - in this example London - they want to explore, selecting from shopping, nightlife, arts, sights, or playing a competition about the city. For example, selecting the nightlife option takes the viewer into a Holiday-style featurette, showcasing London's theatreland, cinemas, bars and clubs. Incorporated into this are options for further mini-documentaries, about Andrew Lloyd-Webber for instance, and also commerce propositions and special offers, for example a 2-for-1 ticket deal at Warner Bros Cinema. 
"We have made the commerce less intrusive by offering sales for things when they occur naturally in the show," says Hastings. An iTV kiosk can be called up at the end of the nightlife section, in which viewers can buy tickets for West End musicals and other attractions using their remote control. Product placement is also used - when a BMW appears in shot, viewers can follow the call-to-action, click on their remote and enter a prize draw for the car. TAXI! is not currently being carried by any of the digital TV platforms, although Entranet is currently in discussion with Sky and ntl. The programme has been transmitted through a Sky digital set-top box for demonstration purposes, but the current problem is the amount of bandwidth the show requires for its different options and programme strands. Through Sky Digital, the show requires 8-10 different channels. TAXI! is currently available on DVD and will soon be offered through a broadband Web site. The show is really being used as a calling card for Entranet's interactive programming, admits Hastings. "The model of broadcasting is changing so quickly, we have to keep pace and anticipate the future. TAXI! is the culmination of two years' worth of learning," he says. TAXI! is underpinned by OpenTV middleware, and requires around 11Mb per stream for broadcast quality delivery.
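The bandwidth problem described above can be sanity-checked with back-of-the-envelope arithmetic: at roughly 11 Mb per stream, the 8-10 parallel channels the show occupies add up quickly. A quick sketch — the stream count and per-stream rate come from the article; the rest is simple multiplication:

```python
MBPS_PER_STREAM = 11  # broadcast-quality rate quoted for TAXI!

def aggregate_bandwidth(channels, per_stream=MBPS_PER_STREAM):
    """Total bandwidth in Mb/s for a show occupying `channels` parallel streams."""
    return channels * per_stream

low, high = aggregate_bandwidth(8), aggregate_bandwidth(10)
print(f"TAXI! needs roughly {low}-{high} Mb/s of capacity")  # 88-110 Mb/s
```

By comparison a conventional linear channel occupies a single stream, which is why platforms balked at carrying the show.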
科技
2016-40/3982/en_head.json.gz/3137
Payoff from Idling Coal Plants Overestimated, Researchers Say Four researchers from Carnegie Mellon University’s Green Design Institute discuss their more conservative estimates of greenhouse gas emission reductions in two papers this month. Oct 10, 2012 A quartet of researchers at Carnegie Mellon University's Green Design Institute conclude in two new papers that ignoring uncertainty in any coal-to-natural-gas transition for generating electricity can make a substantial difference in estimating the net environmental effect of the change. Researchers Aranya Venkatesh, W. Michael Griffin, H. Scott Matthews, and Paulina Jaramillo concluded life cycle assessment (LCA, the study of impacts that occur from cradle to grave) can be useful in these analyses. Their papers appear in the October issue of Environmental Research Letters and in Environmental Science and Technology, according to a news release posted by the university. While many studies simply examine different emissions from coal and natural gas plants, suggesting roughly a 50 percent reduction in greenhouse gas emissions, these researchers conclude the reduction is likely to be 7-15 percent instead because of changes in grid operation in response to price changes in natural gas. "As natural gas prices go down, it becomes cheaper to operate natural gas plants, and some of these plants start being operated more often. This results in some coal plants being operated less often. However, given certain technical constraints related to the operation of existing power plants, the displacement of coal-based generation is limited," said Jaramillo, an assistant research professor in CMU's Department of Engineering and Public Policy. To cut emissions by 50 percent using natural gas would require a significant retirement of coal plants and building new natural gas plants. The second paper examined the uncertainty in emissions that could be expected from retiring coal-fired power plants. 
While it suggests reductions in greenhouse gas emissions from limited retirement of coal plants will be minimal, reductions in emissions of sulfur and nitrogen oxides would be substantial, up to 25 percent in some areas. (The paper focuses on up to 7 gigawatts of coal capacity being retired without building new power plants to replace them.) "We found that if expected coal plants retire, that alone will not bring us dramatic reductions in climate change inducing greenhouse gas emissions," said Matthews, a professor in CMU's Civil and Environmental Engineering and EPP departments. "In addition, the benefits achieved from reducing emissions of sulfur and nitrogen oxides, while substantial in aggregate measures, will not be evenly distributed; and while some counties will see reductions in the emissions of these criteria air pollutants, some counties will see increases," Jaramillo said.
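The gap between the naive "roughly 50 percent" figure and the researchers' 7-15 percent estimate comes down to how much coal generation gas can actually displace. A toy dispatch calculation makes the point — the generation mix and emission factors below are illustrative assumptions, not the paper's numbers:

```python
EF_COAL = 1.0   # t CO2 per MWh, assumed round number for coal
EF_GAS = 0.45   # t CO2 per MWh, roughly half of coal's rate

def grid_emissions(coal_mwh, gas_mwh):
    """Total CO2 for a simple two-fuel grid."""
    return coal_mwh * EF_COAL + gas_mwh * EF_GAS

baseline = grid_emissions(600, 400)       # assumed 60/40 coal-gas mix
# Technical constraints let cheap gas displace only ~20% of coal generation:
constrained = grid_emissions(480, 520)
# Hypothetical full switch to gas, the source of the "~50%" intuition:
full_switch = grid_emissions(0, 1000)

partial_cut = 1 - constrained / baseline  # lands inside the 7-15% band
full_cut = 1 - full_switch / baseline     # near the naive per-plant estimate
```

The per-MWh comparison of the two fuels suggests a large cut, but once displacement is capped the system-wide reduction shrinks by a factor of five.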
科技
2016-40/3982/en_head.json.gz/3164
Up, up, and away By Sam Scott '96 Slow it down: The sail on NanoSail-D trims the speed and brings down the satellite sooner rather than later and cuts down on space junk. Courtesy NASA/Marshall Space Flight Center Further nanosatellite adventures in the cosmos—with SCU students at Mission Control. Launching a 12-pound nanosatellite into orbit is a little bit like becoming the caretaker for a newborn baby. Suddenly you do things on its schedule, not yours. In the weeks after the O/OREOS satellite was detached from an Air Force rocket last November, students with the SCU School of Engineering Robotics Systems Laboratory had to be ready any time the satellite streaked overhead. Be it at 3 a.m. or 3 p.m., they were at Mission Control on the third floor of Bannan Engineering, furiously sending commands and checking vital statistics before the tiny vessel disappeared over the horizon, out of reach till the next pass. “You never know how things are going to act in space,” says Associate Professor Chris Kitts, director of the robotics lab. Waking for satellites means a wearying schedule, doctoral student Michael Neumann ’03 says. But like any guardian, he found it a relief to see things are going well 400 miles above. The satellite, whose name is an acronym for Organism/Organic Exposure to Orbital Stresses, carried astrobiology experiments testing how microorganisms found in soil and salt ponds respond to solar ultraviolet radiation and other ardors of space. Results could help scientists with questions about the origin, evolution, and durability of life. The Small Spacecraft Division The flight was a joint effort between NASA/Ames’ Small Spacecraft Division, which built the 12-pound vessel, and Santa Clara, which managed it. 
Space missions are nothing new for Santa Clara’s Robotics Systems Laboratory, a magnet for undergraduate and graduate students eager for real-world, high-tech challenges in environments as diverse as deep lakes and outer space. For more than 10 years, engineering students involved with the lab have been designing, building, and controlling nanosatellites that are often times as small as a loaf of bread. The lab has been working with NASA since 2004. A cosmic tail In addition to the O/OREOS satellite, the Minotaur rocket that launched last November from Kodiak Island, Alaska, contained three more satellites with SCU connections. One of them, NanoSail-D, reported to SCU’s Mission Control, testing a novel way to force satellites into de-orbit—an important goal given the growing amounts of junk orbiting in space endangering other satellites. After reaching space, the NanoSail unfurled a 10-square-meter sheet of fabric no thicker than single-ply tissue to slow its speed. The rocket also contained two satellites operated by the University of Texas at Austin, using flight computers provided by the Santa Clara team to guide the satellites in formation flying. O/OREOS, though, was the satellite most entwined with SCU. In addition to operating Mission Control for months, students provided the satellite with its own way of de-orbiting. A satellite of O/OREOS’ size, altitude, and density would normally remain in space for more than 60 years before it burned up in Earth’s atmosphere, which is twice as long as NASA guidelines allow. So graduate student Eric Stackpole M.S. ’11 devised a spring-loaded, box-shaped tail that popped out of the satellite after O/OREOS reached orbit, increasing its surface area by more than 60 percent. The increased drag should gradually slow it down, hastening re-entry time for the satellite to less than 25 years. Stackpole’s device marked the first time NASA has used a propellantless de-orbiting mechanism on a scientific satellite. 
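To first order, the effect of a drag device like Stackpole's tail can be framed through the ballistic coefficient m/(Cd·A): orbital lifetime at a given altitude scales roughly with it, so more cross-sectional area means faster decay. A rough sketch — the mass, drag coefficient, and cross-section below are assumed placeholder values, and the actual improvement (60+ years down to under 25) is larger than this linear estimate because drag compounds as the orbit lowers:

```python
def ballistic_coefficient(mass_kg, c_d, area_m2):
    """m / (Cd * A): a lower value means more drag per unit mass, faster decay."""
    return mass_kg / (c_d * area_m2)

MASS = 5.5        # kg, roughly the quoted 12 pounds
C_D = 2.2         # drag coefficient assumed typical for a small satellite
AREA_BEFORE = 0.03             # m^2 cross-section before deployment (assumed)
AREA_AFTER = AREA_BEFORE * 1.6  # ">60 percent" more area once the tail pops out

b_before = ballistic_coefficient(MASS, C_D, AREA_BEFORE)
b_after = ballistic_coefficient(MASS, C_D, AREA_AFTER)
# Lifetime scales ~linearly with the ballistic coefficient at fixed altitude,
# so the area increase alone trims it by ~37% before compounding kicks in.
print(round(b_after / b_before, 3))  # -> 0.625
```

The appeal of the approach is exactly this: no propellant is needed, only geometry.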
The next project will give the lab’s undergraduates a chance to show their power of design. In August 2012, NASA will launch a nanosatellite studying E. coli in space. SCU students are designing a low-power, low-cost mechanical way to point the satellite in a particular direction, necessary for communicating with Mission Control. “There is no other school that does mission operations for NASA the way we do,” says lab director Kitts, who started in satellite operations as an Air Force officer. “It’s really a student-centered operation.” SCU is the only university in the country to let students do all mission operations and ground development for NASA satellites, he says. Students developed the Mission Control center itself, and they wrote the software and operating procedures.
科技
2016-40/3982/en_head.json.gz/3253
EA Access Now Available For Xbox One! Pay Just $4.99/Month For Free Games, Discounts Etc. Got an Xbox One or planning to get one? Then subscribe to EA Access for just $4.99 to save on the purchases of games, play games for free and much more! Subscribe to EA Access @ EA.com here First of all, this is only valid for Xbox One. Second of all, if you have or are planning to get an Xbox One, then $4.99 is hardly a bank-breaking fee especially since you will end up saving so much more money in the long term. An average game usually runs upwards of $70 but with EA Access, you gain access to select games per month with the vault, but there are other benefits as well. In fact, here are the three main advantages to subscribing to EA Access: Once subscribed, you will gain access to the vault, which will allow you to play games for free and unlike PS Plus' rather mediocre line-up of games, which are mostly indie ones, EA Access is setting the bar much higher by promising only the best games like Madden NFL 25, FIFA 14, Battlefield 4 and Peggle 2 and many others to be added later on as well. Play For Less EA Access members can also save 10% on all digital purchases of games. Sadly, this doesn't apply to physical copies of games but with the way in which everything is transferring to cloud technology, I doubt many of you will mind, especially if you save money. Plus, most games tend to be cheaper digitally, so you might even be looking at saving even more than just 10% in comparison to the retail price of physical copies. Play First With this benefit, you are given the advantage of downloading and playing games five days before they are released! This is definitely a major perk, especially for full-time gamers, but I'm sure any gamer would appreciate the extra time to learn the ropes of a new game and take a shot at setting high scores. Plus, it would give a major competitive edge over friends who would be playing the game on a PlayStation 4! 
;) The only drawback is that you are only provided a limited amount of time in order to try the game, but it's still a neat feature to have nonetheless, especially if you have been eagerly awaiting the release of a specific game (i.e. NHL 15). Moosers, do you own an Xbox One? Will you be subscribing to EA Access? Let us know in the comments section! (Expiry: Never)
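Whether the $4.99 monthly fee pays for itself comes down to a quick break-even calculation against the 10% digital discount. A sketch, taking the ~$70 game price from above (everything else is arithmetic, and the free vault games only tilt the math further in the subscription's favor):

```python
import math

def breakeven_games(game_price=70.0, discount=0.10, monthly_fee=4.99):
    """Digital purchases per year needed before the discount covers the fee."""
    annual_fee = monthly_fee * 12            # $59.88 per year
    saving_per_game = game_price * discount  # $7.00 at the quoted price
    return math.ceil(annual_fee / saving_per_game)

print(breakeven_games())  # -> 9 full-price digital games per year
```

So on discounts alone, the subscription breaks even for heavy buyers; for everyone else the vault is the real value.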
科技
2016-40/3982/en_head.json.gz/3307
How DNA finds its match IMAGE: This graphic shows DNA strung between two beads, which are held in position by laser. (Credit: Stephen Kowalczykowski, UC Davis) It's been more than 50 years since James Watson and Francis Crick showed that DNA is a double helix of two strands that complement each other. But how does a short piece of DNA find its match, out of the millions of 'letters' in even a small genome? New work by researchers at the University of California, Davis, handling and observing single molecules of DNA, shows how it's done. The results are published online Feb. 8 by the journal Nature. Defects in DNA repair and copying are strongly linked to cancer, birth defects and other problems. "This is a real breakthrough," said Stephen Kowalczykowski, professor of microbiology and co-author on the paper with postdoctoral researcher Anthony Forget. "This is an issue that has been outstanding in the field for more than 30 years." "It's the solution of one of the greatest needle-in-the-haystack problems in biology," said Professor Wolf-Dietrich Heyer, a UC Davis molecular biologist who also studies DNA repair but was not involved in this research. "How can one double-stranded DNA break find its match in an entire genome, five billion base pairs in humans? Now we know the fundamental mechanism," Heyer said. Forget and Kowalczykowski used technology developed in Kowalczykowski's lab over the past 20 years to trap lengths of DNA and watch, in real time, as the proteins involved in copying and repairing DNA do their work. The first step in repairing a damaged piece of normally double-stranded DNA by a process called recombination is to strip it to a single strand. That single-stranded DNA then looks for a complementary sequence within an intact chromosome to use as a template to guide the repair. How does a short, single-stranded piece of DNA find its exact matching partner out of perhaps millions of possibilities? 
In the 1970s, scientists discovered a protein, called RecA in bacteria and Rad51 in humans, which binds to the single-stranded DNA, forms an extensive filament and guides it to the right place in the chromosome. "This is a very important aspect of chromosome maintenance," Kowalczykowski said. "Without it, your genome will start to scramble very quickly." Defects in some proteins associated with DNA repair are associated with an increased risk of cancer - for example BRCA2, the breast cancer gene. But animals with defects in Rad51 don't even survive as embryos. But how this search for DNA sequence compatibility works has been unclear. The RecA/DNA complex has to bump into and sample different stretches of DNA until it finds the right one, but the number of sequences to search is huge - it's like finding the proverbial needle in the haystack. One model would be for RecA and its attached single-stranded DNA to slide along the intact duplex DNA until it gets to the right place. Or, if the DNA is in a coiled up form like a bowl of spaghetti, the RecA/DNA filament might be able to touch several different stretches of DNA simultaneously and thus shorten the time for the search. Forget set out to test these ideas by stretching single molecules of duplex DNA between two tiny beads to make a dumbbell shape. Both beads were held in place by laser beams, but one of the beads could be steered around using the laser. Then he added the RecA assembled on single-stranded DNA to the DNA-dumbbells and watched to see how well they attached to the target DNA when it was stretched out, or relaxed and allowed to coil up. "These are very complicated experiments to perform," Kowalczykowski said. They found that the RecA complex attached most efficiently to the target DNA when it was in a relaxed, coiled form. "The most efficient homology search is when the local DNA density is higher and the RecA-DNA filament can contact more areas of duplex DNA at the same time," Kowalczykowski said. 
"RecA doesn't slide along the DNA looking for a partner." Consider a bowl of spaghetti, Kowalczykowski said. If you were looking for one tiny region on just one piece of spaghetti in the bowl, you could grab several strands at once and quickly examine each. But if the spaghetti were stretched out in one long piece, you could only touch one part of one piece at a time. Kowalczykowski began working on the system for studying single molecules of DNA in 1991 with the late Ron Baskin, professor of molecular and cellular biology at UC Davis. In 2001, they demonstrated the technique by filming an enzyme called a helicase at work in real time unwinding the double helix of DNA. Since then, they have used the method to get new insights into the complex of proteins that copy and repair DNA. Kowalczykowski's lab was also one of two UC Davis groups to purify the protein made by the BRCA2 gene, strongly associated with breast cancer. BRCA2, it turns out, loads Rad51 - the human equivalent of RecA in bacteria - onto DNA to search the human DNA for the correct region to use for repair. The work was funded by the National Institutes of Health and the American Cancer Society.
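The pairing rule underlying the search is simple to state in code, even though, as the study shows, RecA finds the target by three-dimensional sampling of many segments at once rather than by scanning the strand linearly. A toy illustration of complementarity matching (the sequences are invented):

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    """The sequence that base-pairs with `seq` (A-T, G-C), read 5' to 3'."""
    return seq.translate(COMPLEMENT)[::-1]

def find_homology(single_strand, chromosome):
    """Index in `chromosome` where `single_strand` can base-pair, or -1."""
    return chromosome.find(reverse_complement(single_strand))

chromosome = "GGGATTACAGGG"   # toy "intact chromosome"
broken_end = "TGTAAT"         # single-stranded DNA exposed at a break
print(find_homology(broken_end, chromosome))  # -> 3
```

A linear scan like `str.find` is the computational analogue of the sliding model the experiments ruled out; the biology effectively parallelizes the search by touching many stretches of the coiled duplex simultaneously.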
科技
2016-40/3982/en_head.json.gz/3311
Study of Andromeda's stellar disk indicates more violent history than Milky Way Survey data reveal a more disordered stellar population in our galactic neighbor than in our own galaxy, suggesting more recent bombardment of Andromeda by smaller galaxies IMAGE: This Hubble image of a crowded star field in the disk of the Andromeda galaxy shows that stars of different ages can be distinguished from one another on the basis of... (Credit: Ben Williams, PHAT collaboration) A detailed study of the motions of different stellar populations in the disk of the Andromeda galaxy has found striking differences from our own Milky Way, suggesting a more violent history of mergers with smaller galaxies in Andromeda's recent past. The structure and internal motions of the stellar disk of a spiral galaxy hold important keys to understanding the galaxy's formation history. The Andromeda galaxy, also called M31, is the closest spiral galaxy to the Milky Way and the largest in the local group of galaxies. "In the Andromeda galaxy we have the unique combination of a global yet detailed view of a galaxy similar to our own. We have lots of detail in our own Milky Way, but not the global, external perspective," said Puragra Guhathakurta, professor of astronomy and astrophysics at the University of California, Santa Cruz. The new study, led by UC Santa Cruz graduate student Claire Dorman and Guhathakurta, combined data from two large surveys of stars in Andromeda, one conducted at the W. M. Keck Observatory in Hawaii and the other using the Hubble Space Telescope. The Spectroscopic and Photometric Landscape of Andromeda's Stellar Halo (SPLASH) survey has used the Keck/DEIMOS multi-object spectrograph to measure radial velocities of more than 10,000 individual bright stars in Andromeda. The recently completed Panchromatic Hubble Andromeda Treasury (PHAT) survey provides high-resolution imaging at six different wavelengths for more than half of these stars. 
"The high resolution of the Hubble images allows us to separate stars from one another in the crowded disk of Andromeda, and the wide wavelength coverage allows us to subdivide the stars into sub-groups according to their age," said Dorman, who is presenting her findings on Thursday, January 8, at the winter meeting of the American Astronomical Society in Seattle. The study presents the velocity dispersion of young, intermediate-age, and old stars in the disk of Andromeda, the first such measurement in another galaxy. Dorman's analysis revealed a clear trend related to stellar age, with the youngest stars showing relatively ordered rotational motion around the center of the Andromeda galaxy, while older stars displayed much more disordered motion. Stars in a "well ordered" population are all moving coherently, with nearly the same velocity, whereas stars in a disordered population have a wider range of velocities, implying a greater spatial dispersion. "If you could look at the disk edge on, the stars in the well-ordered, coherent population would lie in a very thin plane, whereas the stars in the disordered population would form a much puffier layer," Dorman explained. The researchers considered different scenarios of galactic disk formation and evolution that could account for their observations. One scenario involves the gradual disturbance of a well-ordered disk of stars as a result of mergers with small satellite galaxies. Previous studies have found evidence of such mergers in tidal streams of stars in the extended halo of Andromeda, which appear to be remnants of cannibalized dwarf galaxies. Stars from those galaxies can also accrete onto the disk, but accretion alone cannot account for the observed increase in velocity dispersion with stellar age, Dorman said. An alternate scenario involves the formation of the stellar disk from an initially thick, clumpy disk of gas that gradually settled. 
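Velocity dispersion, the quantity Dorman measured, is simply the statistical spread of line-of-sight velocities about a population's mean. The sketch below uses invented velocities, not SPLASH or PHAT data, to illustrate how an ordered (rotation-dominated) and a disordered population differ:

```python
import statistics

# Hypothetical line-of-sight velocities (km/s) for two illustrative
# stellar populations. These numbers are made up for illustration,
# not taken from the SPLASH/PHAT surveys.
young_stars = [248, 252, 250, 251, 249, 250, 253, 247]   # ordered rotation
old_stars   = [210, 275, 190, 260, 300, 180, 245, 230]   # disordered motion

def velocity_dispersion(velocities):
    """Velocity dispersion = standard deviation about the mean velocity."""
    return statistics.pstdev(velocities)

print(f"young population dispersion: {velocity_dispersion(young_stars):.1f} km/s")
print(f"old population dispersion:   {velocity_dispersion(old_stars):.1f} km/s")
```

A rotation-dominated population clusters tightly around one velocity, while a disordered population spreads widely, which is the trend with stellar age that Dorman reports.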
The oldest stars would then have formed while the gas disk was still in a puffed up and disordered configuration. Over time, the gas disk would have settled into a thinner configuration with more ordered motion, and the youngest stars would then have formed with the disk in a more ordered configuration. According to Dorman, a combination of these mechanisms could account for the team's observations. "Our findings should motivate theorists to carry out more detailed computer simulations of these scenarios," she said. The comparison to the Milky Way revealed substantial differences suggesting that Andromeda has had a more violent accretion history in the recent past. "Even the most well ordered Andromeda stars are not as well ordered as the stars in the Milky Way's disk," Dorman said. In the currently favored "Lambda Cold Dark Matter" paradigm of structure formation in the universe, large galaxies such as Andromeda and the Milky Way are thought to have grown by cannibalizing smaller satellite galaxies and accreting their stars and gas. Cosmologists predict that 70 percent of disks the size of Andromeda's and the Milky Way's should have interacted with at least one sizable satellite in the last 8 billion years. The Milky Way's disk is much too orderly for that to have happened, whereas Andromeda's disk fits the prediction much better. "In this context, the motion of the stars in Andromeda's disk is more normal, and the Milky Way may simply be an outlier with an unusually quiescent accretion history," Guhathakurta said. ### Other researchers who collaborated with Dorman and Guhathakurta on this study include Anil Seth at the University of Utah; Daniel Weisz, Julianne Dalcanton, Alexia Lewis, and Benjamin Williams at the University of Washington; Karoline Gilbert at the Space Telescope Science Institute; Evan Skillman at the University of Minnesota; Eric Bell at the University of Michigan; and Katherine Hamren and Elisa Toloba at UC Santa Cruz. 
This research was funded by the National Science Foundation and NASA.
Science & Technology
2016-40/3982/en_head.json.gz/3386
Accelerating Science Discovery - Join the Discussion OSTIblog James Van Allen – Space Pioneer 10 Jun 2016 Published by Kathy Chambers James Van Allen's space instrumentation innovations and his advocacy for Earth satellite planetary missions ensured his place among the early leaders of space exploration. After World War II, Van Allen began his atmospheric research at the Johns Hopkins University Applied Physics Laboratory and Brookhaven National Laboratory. He went on to become the Regent Distinguished Professor and head of the University of Iowa (UI) Department of Physics and Astronomy. Drawing on his many talents, Van Allen made tremendous contributions to the field of planetary science throughout his career. Van Allen used V-2 and Aerobee rockets to conduct high-altitude experiments, but the lift was limited. He devised a 'rockoon,' a rocket lifted by hot air balloons into the upper atmosphere, where it was separated from the balloons and ignited to conduct cosmic-ray experiments. The rockoon, shown with Van Allen in the image above, achieved a higher altitude at a lower cost than ground-launched rockets. This research helped determine that energetic charged particles from the magnetosphere are a prime driver of auroras. Thorium – An Element with Promise 09 May 2016 Published by Kathy Chambers Thorium (232Th), the chemical element named after the Norse god of thunder, has a history that is as colorful as its namesake. Although discovered in 1828 by the Swedish chemist Jöns Jakob Berzelius, thorium had no known useful applications until 1885, when it was used in gas mantles to light up the streets across Europe and North America. Then in 1898, physicist Marie Curie and chemist Gerhard Schmidt observed thorium to be radioactive, and subsequent applications for thorium declined due to safety and environmental concerns. 
The scientific community would later find that the element thorium held promise as a source of clean, safe, cheap, and plentiful nuclear power, an alternative to plutonium-based nuclear power plants. Climate Change Research 24/7 11 Apr 2016 Published by Kathy Chambers Image credit: ARM Program One of the research programs managed by the Department of Energy (DOE) is the Atmospheric Radiation Measurement (ARM) Program, created in 1989 to address scientific uncertainties related to global climate change. ARM's Climate Research Facility, a DOE scientific user facility, provides the world's most comprehensive 24/7 observational capabilities to obtain atmospheric data specifically for climate change research. The ARM facility includes fixed, mobile, and aerial sites that gather continuous measurements used to study the effects and interactions of sunlight, radiant energy, clouds, and aerosols and their impacts on the global climate system. The ARM program serves as a model and a knowledge base for climate change research endeavors across the globe. What is Scientific and Technical Information (STI)? 06 Apr 2016 Published by Judy Gilmore Scientific and technical information, or STI: It's in OSTI's name. It's in the language of our most recent statutory authority, section 982 of the Energy Policy Act of 2005: "The Secretary, through the Office of Scientific and Technical Information, shall maintain within the Department publicly available collections of scientific and technical information resulting from research, development, demonstration, and commercial applications supported by the Department." A DOE policy directive, DOE Order 241.1B, entitled "Scientific and Technical Information Management," requires DOE offices, contractors, and grantees "to ensure that STI is appropriately managed as part of the DOE mission to enable the advancement of scientific knowledge and technological innovation." 
As provided in the directive, OSTI spearheads the DOE Scientific and Technical Information Program (STIP), a collaboration of STI managers and technical information officers from across the DOE complex responsible for identifying, collecting, disseminating, and preserving the results of DOE-funded research and development (R&D). STI is the heart of OSTI and its mission. The STI that OSTI makes available is produced and published in a variety of media and formats. OSTI disseminates this STI publicly via a suite of web-based searchable databases featuring basic and advanced search capabilities, including semantic search, customized alerts, results displayed in relevance rank, in-document searching, and downloadable search results. SciTech Connect... OSTI Helping High Energy Physics Collaboration to Register Datasets 01 Apr 2016 Published by Sara Studwell The Department of Energy (DOE) Office of Scientific and Technical Information (OSTI) is working with a researcher in the High Energy Physics (HEP) community to register scientific datasets produced by a domain collaboration, a recent blog post has reported. OSTI offers a service for registering datasets to help increase access to digital data from DOE-funded scientific research. Through the DOE Data ID Service, OSTI assigns persistent identifiers, known as Digital Object Identifiers (DOIs), to datasets submitted by DOE and its contractor and grantee researchers and registers the DOIs with DataCite to aid in citation, discovery, retrieval, and reuse. OSTI assigns and registers DOIs for datasets for DOE researchers as a free service to enhance the Department of Energy's management of this important resource.
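As a rough illustration of what a DataCite dataset registration looks like, the sketch below builds a minimal metadata payload in DataCite's JSON:API format. The DOI (which uses DataCite's test prefix), titles, names, and URL are all placeholders; this is not OSTI's actual DOE Data ID Service workflow, only a guess at the general shape of the metadata DataCite expects.

```python
import json

# Minimal illustrative DOI-registration payload in DataCite's JSON:API
# format. Every value below is a placeholder for illustration.
payload = {
    "data": {
        "type": "dois",
        "attributes": {
            "doi": "10.5072/example-dataset-001",  # 10.5072 is a test prefix
            "titles": [{"title": "Example HEP collaboration dataset"}],
            "creators": [{"name": "Example Collaboration"}],
            "publisher": "Example National Laboratory",
            "publicationYear": 2016,
            "types": {"resourceTypeGeneral": "Dataset"},
            "url": "https://data.example.org/datasets/001",
        },
    }
}

# In practice this JSON would be submitted to the registration service;
# here we just render it to show the structure.
print(json.dumps(payload, indent=2))
```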
Science & Technology
2016-40/3982/en_head.json.gz/3400
http://www.reason.org/news/show/bipolar-bad-news-for-global-wa Bipolar Bad News for Global Warming Alarmists Global warming alarmists got some bad news from both poles recently. One: London's Telegraph columnist Christopher Booker reported yesterday that a research team jointly dispatched by the BBC and the World Wildlife Fund to the North Pole expressly to measure how quickly the Arctic ice sheet is melting ran into just one problem: It found no evidence of melting. In fact, since last March, it seems that the ice sheet has thickened by at least half a meter. A tip-off that things were not going to turn out as anticipated came when the team, whose express mission was to raise awareness about global climate change ahead of the December confab in Copenhagen, saw wandering around aimlessly one of those polar bears who are supposedly near extinction due to global warming. Separately, Booker reported that a London employment tribunal ruled that a firm had wrongly dismissed from the position of "head of sustainability" someone who, in his fervent commitment to "climate change," was trying to reduce the company's "carbon footprint." The tribunal chairman David Neath found the company guilty of discriminating against the employee under the 2006 Equality (Religion and Belief) Regulations, because his faith in global warming was a "philosophical belief." Recalling how "eco-psychologists" at the University of the West of England are pressing for "climate denial" to be classified as a form of "mental disorder," writes Booker, "one doubts whether the same legal protection would be given to those who fail to share" the fired employee's "philosophical beliefs." Two: A study in the journal Nature calculated that the time span for the Antarctic's melting is not on the scale of a hundred years, as alarmists have been hyperventilating, but thousands. The New York Times' Andrew Revkin, no global warming skeptic, reports: "Dr. Pollard and Dr. 
DeConto ran a five-million-year computer simulation of the ice sheet’s comings and goings, using data on past actual climate and ocean conditions gleaned from seabed samples (the subject of the other paper) to validate the resulting patterns.The bottom line? In this simulation, the ice sheet does collapse when waters beneath fringing ice shelves warm 7 to 9 degrees Fahrenheit or so, but the process — at its fastest — takes thousands of years. Over all, the pace of sea-level rise from the resulting ice loss doesn’t go beyond about 1.5 feet per century, Dr. Pollard said in an interview, a far cry from what was thought possible a couple of decades ago.”So will President Obama please take a deep breath and exhale before committing the U.S. to an energy diet through a cap-and-trade scheme? Shikha Dalmia is Senior Analyst
Science & Technology
2016-40/3982/en_head.json.gz/3405
Does Stephen Hawking believe in God? In an interview with The Guardian in 2011, Stephen Hawking was dismissive of the view that there is a God. He said he believed that the human brain is like a computer, which dies after its components fail. In Hawking's opinion, there is no afterlife to look forward to. What did Stephen Hawking invent or discover? In 2010, Stephen Hawking published a book, "The Grand Design," which laid out his thoughts on religion. In it Hawking said, "There is no need for a creator to explain the existence of the universe." Hawking was diagnosed with motor neurone disease more than 50 years ago, but he has since become one of the world's most noted physicists. What has Stephen Hawking said about aliens? Stephen Hawking believes aliens are real, and humans should tread carefully when first meeting them. Mathematically, he believes there is a challenge in fi... What is Stephen Hawking's IQ? Stephen Hawking's IQ is not known for certain. Hawking has never been interested in how high his IQ is, but it has been estimated to be over 160. Hawking i...
Science & Technology
2016-40/3982/en_head.json.gz/3506
Apple's New iPad Costs at Least $316 to Build, IHS iSuppli Teardown Shows Apple's new iPad hit store shelves today. That means that along with the lines at the stores and the requisite applause of store employees cheering people who buy them, there were among the many iPad buyers today people who just couldn't wait to get the gadget torn apart. The analysts at the market research firm IHS iSuppli, considered by the investment community to be the most reliable of the organizations that conduct teardowns, were among that set. Today, somewhere in Southern California, an iSuppli analyst stood in line at a store and promptly took an iPad to a lab, where it was torn into, initiating the interesting process of estimating what it all cost to build. Here's what iSuppli's team found: First off, there weren't many changes from the last iPad, in terms of suppliers. "It's most of the same characters we saw last time around," analyst Andrew Rassweiler told me today. Wireless chipmakers Qualcomm and Broadcom both reappeared — Qualcomm supplying a baseband processor chip, Broadcom a Bluetooth and Wi-Fi chip, TriQuint Semiconductor supplying some additional wireless parts. STMicroelectronics once again retained its position supplying the gyroscope. Cirrus Logic supplied an audio codec chip. The 16 gigabyte, Wi-Fi-only iPad that sells for $499 costs about $316 to make, or about 63 percent of the device's retail price. On the upper end, the 4G-ready 64GB model that sells for $829 costs about $409 to make, or about 49 percent of the retail price. The new cost figures represent an increase of between 21 percent and 25 percent, depending on the model, from the iPad 2, which iSuppli tore down last year. So what did they find inside? An expensive Samsung display, for one thing. All those millions of pixels don't come cheap. ISuppli analyst Andrew Rassweiler estimates that the display, which cost $57 on the iPad 2, has grown in cost to $87 on the latest iPad. 
Rassweiler says that two other vendors, LG Display and Sharp Electronics, have inked display supply deals with Apple for the latest iPad, but only Samsung is thought to have fully ramped up production. Depending on the vendor, the display may cost as much as $90, he said. One set of components remained essentially the same as before: those that drive the touchscreen capabilities. Rassweiler says that three Taiwanese companies, TPK, Wintek and Chi Mei, supply parts related to driving the central interface feature of the new iPad, but he says to expect a major shift in how Apple handles the touch interface on future iPads. The combined cost of cameras, including the front-facing and back camera, is pegged at $12.35, more than three times the cost of cameras found on the iPad 2, Rassweiler says. But it's essentially the same setup as that on the iPhone 4, he says. As has been the case with cameras, the identity of the supplier wasn't easy to determine, because suppliers try hard to hide identifying information from the prying eyes of teardown analysts. The candidates, however, include Largan Precision Co., a Taiwanese supplier of camera modules to wireless phone companies, and OmniVision. On the iPhone 4S, a research firm called Chipworks identified the CMOS sensor in one of the cameras as having come from Sony. As with other Apple devices, the main processor chip is an Apple-made A5X processor, one manufactured under contract by Samsung. The estimated cost of that chip is $23, up from $14 on the iPad 2. Another part that's more expensive than on the last iPad, but also better for a variety of reasons, is the battery. This one is estimated to have cost Apple $32, up from $25 on the iPad 2. But it constitutes a significant upgrade, Rassweiler says, with 70 percent more capacity than before. Apple benefited in part from lower prices for the lithium polymer material used to make the battery, offsetting the cost of adding a vastly improved battery. 
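The bill-of-materials percentages quoted above are simple ratios of iSuppli's cost estimates to retail prices; a quick sketch reproduces them:

```python
# Reproduce the cost-to-retail ratios quoted in the article.
# BOM (bill of materials) estimates are iSuppli's figures.
models = {
    "16GB Wi-Fi": {"bom": 316, "retail": 499},
    "64GB 4G":    {"bom": 409, "retail": 829},
}

for name, m in models.items():
    share = m["bom"] / m["retail"]
    print(f"{name}: ${m['bom']} BOM is {share:.0%} of the ${m['retail']} retail price")
```

This matches the article's roughly 63 percent and 49 percent figures, and shows why Apple's margin (before assembly, software, and distribution costs) is much fatter on the higher-capacity models.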
ISuppli wasn't the only outfit conducting teardowns of the iPad today. An enthusiast site called iFixit, which encourages consumers to learn how to repair and upgrade their own electronics, flew technicians to Australia to conduct its own teardown analysis.
Science & Technology
2016-40/3982/en_head.json.gz/3520
National Aquarium and National Wildlife Federation Join Forces The National Aquarium and the National Wildlife Federation have joined forces to protect wildlife and water resources for future generations. Approved by unanimous vote at the most recent Board of Directors meeting, the National Aquarium has been selected as the National Wildlife Federation's Maryland affiliate. This partnership will link conservation efforts from Appalachia to the Chesapeake Bay and the Atlantic Ocean. "This is a tremendous opportunity to align the efforts of this nation's aquarium with one of its most effective conservation organizations," said John Racanelli, CEO of the National Aquarium. "The National Aquarium team has worked tirelessly over the past 30 years to preserve and protect the Chesapeake Bay, in that time restoring 155 acres of bay shorelines with 1.4 million individual native plants, shrubs and trees. This exciting new alliance will allow us to further expand our reach and strengthen our impact." "We are delighted to welcome the National Aquarium into the Federation's family of 48 affiliates," said Larry Schweiger, president and CEO of National Wildlife Federation. "The National Aquarium is the trusted voice of the aquatic world, filling visitors with a sense of wonder, educating them about the threats to our oceans and water resources, and inspiring them to take individual action." Maryland is part of NWF's Chesapeake Mid-Atlantic region, one of nine such regions throughout the United States. Affiliates in each region work together and with partners to advance conservation and protect the region's unique natural treasures. The Mid-Atlantic region includes Pennsylvania, Delaware, Maryland, Washington D.C., Virginia, West Virginia and North Carolina. 
"The National Aquarium will be a great addition to our region-wide efforts to conserve our resources and to connect families with the natural world," said Tony Caligiuri, NWF Mid-Atlantic regional executive director. "We're already collaborating on important efforts to restore the Chesapeake Bay and look forward to working together to ensure that aquatic habitats are preserved for future generations." "Both of our organizations are dedicated to inspiring people to take an active role in protecting our natural resources," said Laura Bankey, director of conservation at the National Aquarium. "We are excited about the national impact we will have by joining together to protect and restore our ecosystems." National Wildlife Federation, founded 75 years ago, has 4 million members and supporters nationwide. Affiliate representatives elect the NWF Board of Directors and set the organization's policy objectives in the form of resolutions. NWF has more than 82,000 members and supporters in Maryland.
Science & Technology
2016-40/3982/en_head.json.gz/3547
Social media's impact: Your mistakes are public--and they live forever Michele McDonald | ATWOnline If you disappoint your passengers, "it will be public, and it will live forever," Forrester Research analyst Henry Harteveldt said at the recent SITA IT Summit in Cannes, France. Harteveldt was talking about social media phenomena such as Facebook, YouTube, Flickr and Twitter, which allow travelers to broadcast their experiences immediately to hundreds, thousands, even millions of people. The reality of social media is that airlines must treat their customers as though the entire world is watching. A good chunk of it may be doing just that. Harteveldt's presentation, titled "Keeping Mr. 22D Happy: What changing passenger behaviors and attitudes mean for air transport industry IT," included two photos posted by Flickr users. One depicted the airport in Baden, Germany, where, the poster said, "Ryanair gave me a birthday present on Sunday of an extra 5 hours and 40 minutes in Germany. The bad news is that I spent them at Baden airport!" The other memorialized in less-than-appetizing detail a breakfast served on a BMI transatlantic flight. An emphatic demonstration of Harteveldt's warning came a few days after he spoke in Cannes, when Canadian folksinger Dave Carroll posted a video on YouTube that details a battle with United Airlines in music and lyrics. Carroll's song describes his experience in March 2008, when he and his band traveled from Nova Scotia to Nebraska with a connection at Chicago O'Hare. While sitting on a plane in Chicago, he heard a passenger exclaim, "My God, they're throwing guitars out there!" He immediately alerted three employees that baggage handlers were mishandling expensive equipment, but none took any action. Sure enough, Carroll's guitar was severely damaged. 
For more than a year, Carroll tried to get United either to replace the guitar, pay for the repair or provide travel vouchers as compensation. Nothing happened until he posted his video, "United Breaks Guitars," on the evening of July 6. On July 7, United took action, and it, too, used social media -- in this case, Twitter -- to say, "This has struck a chord w/ us and we've contacted him directly to make it right." In another "tweet" on July 8, United said Carroll's video "is excellent and that is why we would like to use it for training purposes so everyone receives better service from us." But by July 10, the video had been viewed more than 1.38 million times, and more than 14,000 viewers went to the trouble of rating it (it got five stars). A Google search for the terms "Dave Carroll guitar United" returned 65,700 links. The video made the rounds on Facebook, too. Travel writer Peter Greenberg posted a story about it for his 1,652 friends. Carroll was invited to appear on CBS News' morning show, where he said that were it not for the video, he was sure United would not have gotten in touch with him. "They told me I wouldn't ever hear from them again." The moral of the story: Passengers are now armed with the ability to air their complaints instantaneously and globally. "Public relations no longer controls the message," Harteveldt said in his presentation. The message is viral, and it lives in cyberspace for a long, long time. The flip side: Social media can be used as a vehicle to reach out to customers, Harteveldt said. JetBlue and United, for example, alert their Twitter followers to short-term offerings. Even more intriguing is the ability to use Twitter to nip customer service issues in the bud before they become public relations nightmares. When a customer tweeted, "JetBlue, I need a wheelchair," she got an instant response from the carrier's Twitter account. "Twitter is becoming the customer service feedback loop," Harteveldt said. 
Science & Technology
2016-40/3982/en_head.json.gz/3574
ChemImage Sensor Systems selected to develop portable chemical detector Wednesday, Jul 23, 2014 @ 1:30pm by BioPrepWatch Reports ChemImage Sensor Systems (CISS), a subsidiary of ChemImage Corporation, announced on Monday that it was selected by the U.S. Army as one of the organizations to develop a portable chemical agent detection system. The U.S. Army Next Generation Chemical Detector (NGCD) program selected CISS and multiple other organizations to develop a portable system for the detection and location of chemical warfare agents (CWA) on environmental surfaces. The work will be performed under the direction of the Joint Project Manager for Nuclear, Biological and Chemical Contamination Avoidance (JPM-NBC-CA), according to a CISS press release. "We welcome the opportunity to show how hyperspectral imaging can perform reliable CWA detection and location with the goal of saving lives," said Charles Gardner, the project manager for CISS. As part of the project, CISS will configure and evaluate its existing portable hyperspectral imaging technology for the detection of CWA at a breadboard level. The company will then progress the development of the system through brassboard and final prototype stages. The U.S. government will conduct rigorous testing in each phase to ensure the CISS system meets Army requirements. "Recent events have shown us that CWA are still a very real threat in our world," said Matthew Nelson, the chief scientist and business director at CISS. "CISS is excited about working with the U.S. Army to provide handheld instruments that dramatically lessen the warfighter impact of these terrible weapons of mass destruction." ChemImage Corporation, the parent company of CISS, develops hyperspectral and Raman chemical imaging technology.
Science & Technology
2016-40/3982/en_head.json.gz/3614
Self-driven disruption By Arunabh Satpathy Everybody is talking about autonomous cars these days, but no one knows the exact contours of their effects. To answer that question, the FiRe 2016 conference brought together host Robert Anderson, Chairman and CEO of Hybrid Electric Vehicle Technologies (HEVT LLC), and Craig Giffi, Vice Chairman and US Automotive Leader at Deloitte LLP, to discuss this world-changing technology. Right off the bat, Giffi identified five "areas" that would be critical to the success of the autonomous vehicle. These five areas were entertainingly titled "my mother the car," "what only goes up," "Bill Clinton 1992," "A Game of Thrones," and "consumers are fickle." Anderson started by asking Giffi what he thought the biggest problems solved by autonomous vehicles would be. Giffi responded that safety would be the biggest part. He mentioned that there are over 35,000 annual highway fatalities, with over 94 percent attributable to human error. "The vision is these things never crash," he said. "For society, the most obvious benefit is reducing the risk of a traffic fatality." The session then returned to the five major areas. "My mother the car" refers to vehicle control and ownership, which ridesharing models and autonomous vehicles are increasingly challenging. Giffi was especially vehement in mentioning the second area, "what only goes up," i.e., regulation. 
He pointed to existing areas where Uber and Lyft cannot go, and predicted that certain factors will inevitably cause uneven implementation. The third area, "Bill Clinton 1992," played on the phrase "it's the economy, stupid" by emphasizing "it's the economics, stupid." He said that in companies (especially entrenched companies) investing in autonomous vehicles, return on investment isn't considered enough. He cited $8 billion to $12 billion invested in powertrains running on electricity, gas, diesel, and hybrids, and in new materials for fuel efficiency like graphene. He also said that the auto industry has led the way in diminishing returns: a 1x return is considered big in the auto industry, while Ford and GM get about 0.3x. "With all of the investment that is being put into new technologies, how in fact do the automakers or the industry get any ROI?" He put the smart money on disruptors and outsiders. He was also wary of the disruptive capabilities of the fourth area, titled "A Game of Thrones." Continuing on the ROI point, he said insurance premiums will go down because of safer cars, and dealers will likely go out of business if the automakers or ridesharing companies control the business. He also mentioned massive worker disruption. "If this happens rapidly, it will be fairly catastrophic," he said. "The disruption will be significant." His final area was titled "consumers are fickle." In this area, Giffi mentioned skepticism among American consumers about adopting autonomous vehicles despite the safety argument, while acknowledging that later generations like millennials and post-millennials would be far more receptive. "Consumers are slowly warming up to the notion of safety tech," he said. He further elaborated that, on average, the American consumer is willing to spend less than $1,000 on safety tech, whereas the investment is much higher, causing a major disjunct between investment and ROI. 
He ended the panel by saying that entrenched automakers would have to do the hard job of disrupting themselves, while newcomers would have to look at novel business models to monetize data from the experience. To discover more or read other articles from the conference, visit StratNews.com or our Medium blog.

The CTO Challenge Team Reports Back

By Melissa Dymock

Members of the 2016 CTO Challenge team reported back on their progress to the judges at the closing Friday session of the FiRe conference. The team had been tasked with building a flow computer system that can also measure the energy flow of the Earth. Nathanael Miller, an aerospace engineer at NASA and spokesperson for the team, walked the judges through the work done so far. He said it was "a treat for all of us to work through the night to get the presentation together." Miller said that when analyzing the assignment placed before them, they changed the challenge from only being a flow system to also being an interactive system. Ben Brown, department head at Molecular Ecosystems Biology, explained the system at its most basic: the steps will include sensing the data, routing it, aggregating it, identifying a flow or pattern, and then using, interacting with, and/or archiving the data. Franklin Williams, a principal at Live Earth Imaging Inc., said that to make the system buildable in a timely manner, they suggested using sensors already in place and adding their own aggregators on top of them. "[There are] millions of sensors out there," he said. "We just need to pull them into the system." Once they have data and flows, they can determine what holes exist in the knowledge, and then build their own sensors. Williams said that after the first iteration, they can drive the problem backward. Miller said they hope the system they design will allow variability in what it can do: it could be used by a kid in his garage or by a Fortune 500 company.
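The five steps Brown listed (sensing, routing, aggregating, identifying a flow or pattern, and using or archiving the data) can be sketched as a toy pipeline. Every function name, sensor value, and threshold below is invented for illustration and has no connection to the team's actual design:

```python
# Hypothetical sketch of the described flow:
# sense -> aggregate -> identify a pattern -> archive.
from statistics import mean

def sense(existing_sensors):
    """Pull readings from sensors already in place."""
    return [s() for s in existing_sensors]

def aggregate(readings, window=3):
    """Group raw readings into fixed-size windows before pattern detection."""
    return [readings[i:i + window] for i in range(0, len(readings), window)]

def identify_pattern(windows, threshold=10.0):
    """Flag any window whose average exceeds a simple threshold."""
    return [w for w in windows if mean(w) > threshold]

def archive(flagged, store):
    """Keep flagged windows for later use or interaction."""
    store.extend(flagged)
    return store

# Three fake sensors stand in for the "millions of sensors out there."
sensors = [lambda: 4.0, lambda: 12.0, lambda: 15.0]
store = []
readings = sense(sensors) * 2          # two rounds of readings
windows = aggregate(readings)
archive(identify_pattern(windows), store)
print(store)
```

The point of the sketch is the shape of the flow, not the logic inside each stage; in a real system each function would be a distributed service rather than an in-process call.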
The judges were asked if they would vote up or down on the project. All voted up except Ty Carlson, CTO at Coventry Computer, who voted up with an asterisk. "The impact that we have here is pretty significant," Carlson said. "This is a human surveillance system that you have basically provided." He listed many aspects affected by it: political systems that will resist, private systems that will be directly threatened, and the effect on people's livelihoods. "Does everyone understand the significance of the design?" he asked. Miller said those were all things they were including in their discussions. "We want to make this data as useful as possible to better life on Earth," said David Zuniga, a commercial innovation manager at the Center for the Advancement of Science in Space. To discover more or read other articles from the conference, visit StratNews.com.

Looking Further

By Nick Fritz

Eliot Peper is a former venture capitalist and strategist, and currently a science fiction author of books like "Cumulus" and "Neon Fever Dream." In this session, Berit Anderson, CEO of Scout, discussed the real-world inspirations for Peper's books and his motivation to become an author. The discussion began with an explanation of Peper's uncommon background and his transition to being an author. Peper realized from his time in venture capital that there is a locus of human drama in that world that nobody was writing about. He felt big type-A personalities, high-stakes deals, fortunes won and lost, and potentially world-changing technologies make for juicy writing. "This is the book that I wanted to read, but nobody had written it," he said. "So I did." The discussion turned to "Cumulus," set in a dystopian future world where a giant tech company governs the world. This tech company, although well-meaning, has inadvertently created ubiquitous surveillance and crippling economic disparity.
The theme of the book was inspired by the incredibly powerful social networking and software applications now being created more quickly than ever before, and by the suggestion that there may be very serious but unintended negative social externalities in this software age. Digging into that idea, Anderson asked Peper about the negative externalities he sees playing out over the next 15 years. His answer was primarily geopolitical. He said that information has reduced the usefulness of national borders and has increased their porosity with respect to information flow, economics, and crime. Traditional governments are not well equipped to deal with this porosity, and as a consequence private companies are stepping in to fill the gaps in areas where governments typically operate. He gave the example of Google's Jigsaw, which is working on online crime prevention, radicalization disruption, and protection of human rights, all functions typically filled by government entities. The interview ended with a discussion of Mars as another example of a private firm working in a traditionally governmental sector. Peper spoke about the motivation for Mars colonization not only as a hedge against Earth, but also as an intentional wake-up call to think seriously about taking care of the Earth and to think disruptively about solutions. Following this discussion, it becomes clear that Peper's transition from tech venture capitalist to science fiction is actually not much of a leap. In a world where the nature of international borders is changing through technology and a private company is planning to colonize Mars in the next ten years, Peper's science fiction may not read much differently from his former investment opportunities.
A vision for the Congo

The second breakout session on day three of the Future in Review 2016 conference focused exclusively on Emmanuel Weyi, a presidential candidate in the Democratic Republic of the Congo (DRC), and the massive problems – political, social, and technological – that he faces in getting his country to "leapfrog" others into the 21st century. The panel was moderated by Weyi and Bruce Dines, VP at Liberty Global, and was attended by a combination of curious learners, educators, computer scientists, and entrepreneurs, variously cautious and optimistic about the DRC's future. Dines spoke of access to education and of the opportunities for disruption in the DRC, particularly in telecommunications, which would become the two major themes of the session. "There's an opportunity to leapfrog not only in terms of technology but also in terms of systems and processes," said Dines. Attendee Nelson Heller mentioned the success of some technology in Africa, and the resistance it faces. One educator mentioned her experience trying to introduce computer education in the United States, and the analogous resistance she faced among parents. The responses to the issues in education centered on experiments in self-learning for children, and on identifying parents' pain points and specifically attacking them. The other large theme of the session was telecommunications. Weyi spoke of telecommunications as one of his priorities, with special emphasis on 3G connectivity. Narrating a personal experience, he spoke of his days in a mining company when he had to wait up to two days to speak with his employees while he was in rural areas. Leapfrogging was brought up again, particularly by computer scientist Dallas Beddingfield. The discussion shifted to tech companies and their attempts to connect the world cheaply, with Google's and Facebook's internet initiatives mentioned as solutions.
The political realities of Weyi's path were also acknowledged, including the Belgian-style election process in the DRC, where a primary system similar to that of the US is followed by a runoff if no candidate wins 50 percent of the vote. This led to a discussion about Weyi's motivations for visiting the US. He said his vision of the DRC involves tech, and the US is the center of it. "In the politics, there are two layers," Weyi said. "The layer that everyone sees, and the one that no one sees." He said his visits to US politicians and tech centers were his attempt to capitalize on that second layer, build connections, and create demand for his vision of the Congo. The final theme of the session was the environmental cost of rapid development. Weyi noted that the DRC has the second-largest forest in the world after the Amazon, and the second-greatest amount of biodiversity in the world, and lamented that some of this is threatened by Chinese logging activities. On the question of co-opting locals into the task of protecting the environment, Dines mentioned his work in nature conservation and in bringing local tribal populations into eco-tourism models. He cited the example of the Masai, who once fought over land but now work together to benefit both tourists and the tribes. Weyi agreed and spoke of the need to bring people into the fold. "To bring change, you need involvement," he said. "If you need change, you need to involve people." Weyi spoke of his enthusiasm for and faith in the youth of the Congo, and of how educational infrastructure could be built by companies in exchange for advertising exposure. The session ended on a cautionary note, acknowledging the massive difficulties in executing Weyi's vision.
Sensing Advanced Data Flows

By Chance Murray

This FiRe conference panel brought together experts from fields as different as digitizing smells, quantifying underwater sound travel, and detecting seismic explosions, united by a world of sensors that is reaching new heights and depths. "We're trying to recreate a dog in a device," said Chris Hanson, founder and CEO of Aromyx. Aromyx's technology digitizes smell and taste by creating sensors that mimic the biosensors in the human nose and tongue. The applications of such technology are significant: bomb dogs at security points could be complemented or even replaced by devices employing it. Noise pollution in the ocean has been on the rise due to increased movement of goods around the world. Roger Payne, whale scientist and founder/president of Ocean Alliance, has been studying the physics of whale songs. "In an unpolluted ocean, we determined that whale songs can reach as far as 13,000 miles due to the unique physics of such ocean depths," said Payne. Current sensors for monitoring whale activity, commonly known as "critter cams," provide only a static view of what whales see. Payne described a sensory device that would attach to whales and deploy a camera when its sensors indicate another organism is nearby. The device would collect important information about the whales' habits and environment, and would be powered by ocean currents rotating a small turbine incorporated into it. John Delaney, professor at the University of Washington School of Oceanography, described a sensory device 400 km off the Oregon coast that measures tectonic plate activity. The device sends 14 minutes of video every three hours. Among the highlights of the data flow it has collected is an underwater volcanic eruption, the audio of which Delaney played for the audience. "We've never heard anything like this before," said Delaney.
He then introduced a design for a series of sensors to be placed along the Pacific Rim that would provide flows of data on tectonic plate activity. The series would span the entire northwestern coast, a technology comparable to what Japan installed not long ago. "We can choose to invest in these sensors now, enabling us to capture valuable information about tectonic activity in the coastal region, or choose to incur the cost of reconstruction after a significant earthquake," said Delaney.

Promises and Perils at the Bleeding Edge of Healthcare

By Shelby Cate

In an industry that is often bemoaned, highly regulated, and endlessly complicated, it is difficult to be an entrepreneur. The BBC's Ed Butler once again hosted Oren Gilad, Don Straus, Shawn Iadonato, and Caitlin Cameron, the CEOs of the FiRestarter companies Atrin Pharmaceuticals, First Light Biosciences, Kineta, and OtoNexus Medical, respectively, to discuss their experiences and the future of healthcare innovation. "It's certainly a competitive landscape," said Iadonato. "One thing that has evolved in the pharmaceutical industry is that they've largely gotten out of R&D; they are increasingly reliant on companies like my company, like [Gilad's] company, to get the most attractive new technologies." Cameron sees the same story in the medical device industry. "They look to young companies like us, but they want the angels and the ventures to take the initial risk," she said. Straus wondered about the effect this trend has had on new ideas, saying he has seen a lot of good ideas wither and die on the vine because they aren't developed enough for venture capitalists and angel investors, yet can't move forward without development money. "Right now it feels like we're on a great wave and things are working," he said.
But he also said he has had periods of "sitting around waiting for a wave and it feels like maybe nothing is coming." Straus, whose company provides rapid diagnosis of infections, also touched on the challenges that high-profile failures pose to his industry, such as the recent Theranos bust. "I talk to angel investors and every other one asks me 'why are you not Theranos?'" he said. "We're not Theranos; we have something real." According to Iadonato, the challenge of raising capital for pharmaceutical development in particular is that development is high-risk, capital-intensive, and takes a long time, in contrast to many technology ventures that are lower-risk, require less investment, and have quick development cycles. Gilad, however, took a more hopeful perspective, pointing out that there are now venture arms associated with pharmaceutical companies that are on the ground in academic institutions looking for very early stage opportunities. "Capital is needed in the early stages where risk is high," he said, "but it is out there." All four CEOs acknowledged the ups and downs of being on the bleeding edge of this industry, but also the excitement that comes along with it. "We're part of this lunatic group of people that actually feel we can do something good," said Gilad, "and it's very fun."

Investing in Climate Change

The evidence supporting climate change continues to mount, and the implications for environments and economies are too significant to ignore, according to Hans-Peter Plag, director and professor at Old Dominion University's Mitigation and Adaptation Research Institute. "Currently, emissions from worldwide annual energy usage are comparable to the Lake Toba explosion that occurred nearly 100,000 years ago," Plag said. Compared with the previous 100,000 years, the past 100 years have seen climate metrics change drastically: carbon dioxide levels have risen, coastal zones have moved, and water temperatures have climbed.
Data suggests that a 1 degree Celsius increase in global temperature equates to a 25-meter rise in sea level, Plag said. The implications of his research are broad. City planners and environmentalists can take action to ensure coastal zones have adequate infrastructure and are free of waste and pollution that will wash into the ocean. Real estate developers can invest in properties outside exposed coastal zones or use mobile components capable of adapting to rising sea levels. "It's time to divest of exposed coastal areas," Plag said. "Or, if you build in the coastal zone, invest in mobile infrastructure that can relocate with rising water levels."

Invisible to You

Katia Moritz has been undiagnosed for a long time. The director of the documentary "Undiagnosed" was motivated by her own illness and those of people like her. She was joined on a panel by Tristan Orpin, EVP of Clinical Genomics at Illumina; John Ryals, CEO of Metabolon; and Robin Y. Smith, CEO of ORIG3N. The session was hosted by Doug Jamison, CEO of the Harris & Harris Group. Upon introducing the panel, Sharon Anderson Morris spoke of the "medical refugees" of undiagnosed diseases. The panel started with a clip of the film "Undiagnosed," showing the problems faced by people with medically unexplained symptoms, who often go untreated. The documentary quickly led to the formation of UnDx, a consortium of five tech companies, along with providers and patients, brought together by Moritz. Moritz spoke of her personal story and of how she was working to give voice to the undiagnosed. She also spoke of the tragedy of losing valuable data from people the medical system refuses to treat, who could offer something to the world. She got in touch with Dr. Isaac Kohane at Harvard and put together a database of the undiagnosed. The companies offered their respective products, including genome sequencing by Illumina.
With the gathered data and tech, they decided to continue looking for answers for families. Moritz also spoke with Jamison, who put her in touch with the five biotech companies that set up the UnDx Consortium. "For the first time, all these companies who are working in different areas of biotechnology are working together," he said. A big emphasis of the panel was on the fact that patients, providers, and companies are working together; the prevailing theme was "new technologies plus collaboration equals hope." Taft spoke of his personal experience with the undiagnosed child of a friend. There was further discussion of inflection points, including the fact that many patients were not diagnosed for five to eight years, and many would never be diagnosed.

Digitalization, the Cloud, and the Transformation of the 21st Century

The emerging trend of digitalization is blurring the line between the physical and digital worlds. The dramatic reduction in the cost of data collection, storage, and analysis in the last several years has opened the door for this change, and it's changing the nature of business. Greg Ness guided a panel discussion on the consequences of digitalization on day two of the Future in Review 2016 conference, with Preston McAfee, Michael Schwarz, Mark Sunday, Tim Fitzgerald, James Urquhart, and Edy Liongosari as panel members. Their responses have been aggregated below. So what precisely is digitalization, and what does it mean for large enterprises? Its definition has changed over time. The dramatic reduction in the cost of data services is changing the way we conduct business, and the dramatic change in connectivity is driving further change. The ability to capture data in real time from multiple sources enables us to react in real time. This increased flow of data between the physical and digital worlds is at the core of digitalization.
This increased flow has the potential to drastically change industries, some perhaps more than others; potentially there is no limit to which industries can apply the idea. Agriculture is one example, where monitoring moisture levels can dramatically reduce water usage and increase relative yields. The mass collection of data will allow for macro-analysis, which can then be micro-targeted down to individuals based on their specific needs. These sorts of "personal plans" will permeate many industries. Digitalization will also change the organizational structure of firms: to be used effectively, digitalization efforts will have to be embedded in all functional areas of a business, not siloed in one department. The bottom line is this: digitalization is happening, and the firms that choose to get in front of the wave will prosper. This raises the question: who is working in this space now? General Electric is a great example of a firm that is adapting well. Seemingly overnight, it transformed from a hardware company to a software company, and is now collecting enormous amounts of data on its equipment. Another consequence of this flow is cloud computing, which allows firms to fail fast in innovation. In this environment, slow movers will be damaged quickly, perhaps more quickly than ever before. It's important to remember that although digitalization may spell big changes for the way companies do business, consumers will likely not experience life-changing effects. Something cloud computing enables is collaborative filtering, whereby data sets and patterns are developed by millions of users but are accessible at a personal level. This is the biggest change for consumers, enabled by cloud computing and mobile connections. Furthermore, machine learning applications in voice, picture, and video are changing the applications of cloud computing.
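Collaborative filtering, mentioned above, can be illustrated with a tiny user-based example: a user is recommended an item she hasn't rated, weighted by the ratings of users similar to her. The users, films, and ratings below are invented for illustration:

```python
# Minimal user-based collaborative filtering sketch.
from math import sqrt

ratings = {
    "ana":  {"film_a": 5, "film_b": 3, "film_c": 4},
    "ben":  {"film_a": 5, "film_b": 3, "film_c": 5, "film_d": 4},
    "cara": {"film_a": 1, "film_b": 5, "film_d": 1},
}

def similarity(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    nu = sqrt(sum(ratings[u][i] ** 2 for i in common))
    nv = sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (nu * nv)

def recommend(user):
    """Score unseen items by similarity-weighted ratings from other users."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        w = similarity(user, other)
        for item, r in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + w * r
    return max(scores, key=scores.get) if scores else None

print(recommend("ana"))
```

At the scale the panel describes, the same idea runs over millions of users in the cloud, but the personal-level result is the same: one user's recommendation built from everyone else's patterns.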
One form of digitalization not often mentioned is the digitalization of human assets. Through this process, it will become possible to select an ideal candidate for a job based on their digital profile, or to select the best customer type from a group of potential customers. This process may also have a "flow" effect, whereby the network created by human digitalization allows us to seek out particularly useful contacts for a project or position. This digitalization may drive longitudinal change through the rest of the decade. Thought-to-text and universal language translation may be the biggest change-makers, probably by the end of the decade. These technologies will change the nature of human interaction and tremendously increase the speed at which human assets are digitalized. The organizational structure of firms may begin to change as well. Conway's Law states that systems developed by organizations tend to mirror the communication practices of those organizations. Digitalization may reverse this trend, with organizational structures beginning to reflect the nature of the digital communication protocol.

Population Flows

Populations, like data, capital, and intellectual property, flow across borders. There are many drivers of population flow, which vary with time and region. This breakout session, hosted by Mike Winder, explored the nature and drivers of population flows in the 21st century. Labor is one critical driver of current population flows, as demonstrated by the immigration debate here in the US. However, the nature of work in the future may be fundamentally different from what it has been for the last hundred years. Automation is replacing human labor in positions across all industries. This displacement of traditional roles, which required attendance, by new service-based roles, which may be performed remotely, will necessarily change the dynamics of population flow with respect to labor.
Other drivers of population flow may be environmental. War, poverty, and famine are historically common reasons for human migration, but in the future, environmental collapse may also drive population flow. China, particularly in highly industrialized and populated areas such as Beijing, is already experiencing acute pollution problems that are causing real human suffering. Populations will continue to flow as they always have. The drivers may be different in the 21st century, and understanding these trends will be critical to unlocking value in the coming hundred years.

© 2016 Strategic News Service LLC. FiRe, FiReFilms, Strategic News Service, Project Inkwell, and INVNT/IP are registered trademarks of Strategic News Service LLC. All other trademarks cited here are the property of their respective owners.
Tag: automation

New Surveillance Program Listens for Gunshots, Gets Police There in Minutes

By Veronique Greenwood | May 30, 2012 12:09 pm

These days, our artificial ears and eyes are better than ever, and more ubiquitous than ever. A business recently profiled by the New York Times seems to embody both what's most promising about such pervasive surveillance and what's potentially disturbing. ShotSpotter sells and helps run an automated gunshot-reporting system for police departments, at a cost of $40,000 to $60,000 per square mile. Recording equipment is installed in neighborhoods and linked to software that records sounds that could be gunfire, analyzes them to identify which are actually shots, and then submits its findings for review by a trained employee in the company's Mountain View office. If a human verifies that the sounds are indeed gunfire, the police are notified with the location of the shots, pinpointed to within 40-50 feet. All this can happen in well under five minutes, meaning police can be there right away.

Google Tries to Jump-Start the Driverless Car, But Big Questions Loom

By Veronique Greenwood | May 23, 2011 4:17 pm

What's the News: Google's self-driving cars have been generating buzz lately, with the news that the company has been lobbying Nevada to allow the autonomous vehicles to be operated on public roads. But it remains to be seen whether hordes of self-driving cars are really going to work in the real world.

Google's Self-Driving Cars Are Cruising the California Highways

By Jennifer Welsh | October 11, 2010 11:56 am

Google announced this weekend that it has been driving automated cars around California's roads, and that the vehicles have already logged about 140,000 miles.
A fully automated car just finished a big trip, all the way from Google's campus in Mountain View, California, to Hollywood. "Larry and Sergey founded Google because they wanted to help solve really big problems using technology. And one of the big problems we're working on today is car safety and efficiency. Our goal is to help prevent traffic accidents, free up people's time and reduce carbon emissions by fundamentally changing car use." [Official Google Blog] A Google car drives with the help of a variety of sensors, including cameras on the roof and in front, radars, and laser range finders, which build a detailed map of the car's surroundings. This information is transmitted to the Google servers and processed to detect and react to any obstacles that get in the car's way, mimicking the decisions a human driver would make.
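The ShotSpotter flow described above (record candidate sounds, classify them automatically, have a human verify, then dispatch with a location estimate) can be sketched as a toy pipeline. The classifier rule, thresholds, field names, and coordinates below are all invented for illustration and bear no relation to the company's real system:

```python
# Illustrative acoustic-event reporting flow:
# record -> machine classification -> human review -> dispatch.

def classify(event):
    """Toy stand-in for the acoustic classifier: loud, impulsive sounds score high."""
    return 0.9 if event["peak_db"] > 130 and event["duration_ms"] < 50 else 0.1

def human_verify(event):
    """In the real flow a trained reviewer confirms; here we auto-confirm high scores."""
    return event["score"] >= 0.5

def dispatch(event):
    """Notify police with an estimated location for the shots."""
    return f"Possible gunfire near ({event['lat']:.4f}, {event['lon']:.4f})"

events = [
    {"peak_db": 135, "duration_ms": 20, "lat": 37.3861, "lon": -122.0839},
    {"peak_db": 90, "duration_ms": 400, "lat": 37.3900, "lon": -122.0800},  # e.g. a car door
]

alerts = []
for e in events:
    e["score"] = classify(e)
    if human_verify(e):
        alerts.append(dispatch(e))
print(alerts)
```

The human-review stage is the interesting design choice: machine classification narrows millions of sounds down to candidates, but a person makes the final call before police are dispatched.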
Weekly Wire: The Global Forum

Submitted by Roxanne Bauer on Thu, 05/15/2014

These are some of the views and reports relevant to our readers that caught our attention this week.

Most of What We Need for Smart Cities Already Exists
The compelling thing about the emerging Internet of Things, says technologist Tom Armitage, is that you don't need to reinvent the wheel, or the water and sewage systems, or the electrical and transportation grids. To a large degree, you can create massive connectivity by simple (well, relatively simple) augmentation. "By overlaying existing infrastructure with intelligent software and sensors, you can turn it into something else and connect it to a larger system," says Armitage.

Mideast Media Study: Facebook Rules; Censoring Entertainment OK (PBS MediaShift)
A new study by Northwestern University in Qatar and the Doha Film Institute reveals that Middle Eastern citizens are quite active online, with many spending time on the web daily to watch news and entertainment video, access social media, and stream music, film, and TV. "Entertainment Media Use in the Middle East" is a six-nation survey detailing the media habits of people in Qatar, Egypt, Lebanon, Tunisia, the United Arab Emirates (UAE), and Saudi Arabia. The results of the survey, which involved 6,000 in-person interviews, are in part a reflection of how the Internet has transformed Arab nations since the Arab Spring. More than ever, consumers in the Middle East/North Africa (MENA) region are using technology to pass along vital information, incite social and political change, become citizen journalists, and be entertained.

Global Information Technology Report 2014
The Global Information Technology Report 2014 features the latest results of the NRI, offering an overview of the current state of ICT readiness in the world. This year's coverage includes a record 148 economies, accounting for over 98 percent of global GDP.
The 13th edition of The Global Information Technology Report is released at a time when economies need to solidify the recovery of the past year and leave the worst financial and economic crisis of the past 80 years behind. Developed economies need to sustain their incipient economic recovery and find new areas of growth and employment creation; emerging and developing economies need to build their resilience against turbulence in the markets and foster their innovation potential in order to sustain the rapid economic growth they experienced in the past decade.

Africa Focuses on Building Resilient Cities
Developing viable, livable, and resilient cities is increasingly seen as critically important to giving Africans opportunities to improve their lives. While rural development has long been a priority for governments and development agencies – with particular emphasis placed in recent years on Africa growing enough food to feed all her people – recent research is making a case for more attention to urban areas. Already, more than half the world's people live in cities – as opposed to a tenth a century ago – according to United Nations statistics quoted by The Rockefeller Foundation's "resilient cities" initiative. And UN Department of Economic and Social Affairs figures quoted by Hannah Gibson of the Africa Research Institute in London suggest that Africa will be 50 percent urban by the early 2030s and 60 percent urban by 2050.

What Do New Price Data Mean for the Goal of Ending Extreme Poverty?
Every country in the world has a poverty line, a standard of living below which its citizens are considered poor. Typically, the richer the country, the higher the poverty line is set. The average poverty line of the poorest 15 countries in the world is used to define the global extreme poverty line, a minimum standard of living that everyone should be able to surpass. In 2005, this global poverty line was set at $1.25 per person per day.
The Millennium Development Goals set out to halve the share of people living below this global minimum by 2015, and the successor agreement on sustainable development goals promises to finish the job and end extreme poverty everywhere by 2030. How feasible is the goal of ending extreme poverty, and where is poverty concentrated?

'Cow will make your baby fat': breaking food taboos in west Africa
"A pregnant woman should not eat cow. The child will be fat," said one respondent during research carried out on nutritional taboos among the Fulla people in the Upper River region of the Gambia. In comparison to the rest of western Africa, WHO classifies the Gambia's malnutrition rates as moderate. Nevertheless, the World Food Programme (WFP) is currently providing assistance to 12,500 pregnant and nursing mothers and 50,500 children in the Upper River region by distributing cereal (rice and millet) each month. Nutritional taboos can hamper NGOs' hunger and malnutrition relief efforts. The issue is even more of a concern during humanitarian crises, when food supplies are at critically low levels and people are likely to lack nutrients and be more susceptible to disease.

Photo credit: Flickr user fdecomite
科技
2016-40/3982/en_head.json.gz/3644
IPCC's warning about ice disappearing from mountain tops based on student dissertation
By ANI, Sunday, January 31, 2010

LONDON - In what may cause fresh embarrassment to the Intergovernmental Panel on Climate Change (IPCC), it has emerged that its warning about ice disappearing from the world's mountain tops was based on a student's thesis and an article published in a mountaineering magazine. Earlier, the IPCC had to issue a humiliating apology over its inaccurate claim that global warming would melt most of the Himalayan glaciers by 2035, a claim based on a "speculative" article published in New Scientist.

In its recent report, the IPCC stated that observed reductions in mountain ice in the Andes, Alps and Africa were being caused by global warming, citing two papers as the source of the information. However, it has emerged that one of the sources quoted was a feature article published in a popular magazine for climbers, which was based on anecdotal evidence from mountaineers about the changes they were witnessing on the mountainsides around them, The Telegraph reports. The other was a dissertation written by a geography student, studying for the equivalent of a master's degree at the University of Berne in Switzerland, that quoted interviews with mountain guides in the Alps.

After it surfaced that the IPCC had been using unsubstantiated claims and sources for its warnings, sceptics cast doubt over the validity of the IPCC and called for the panel to be disbanded. "These are essentially a collection of anecdotes. Why did they do this? It is quite astounding. Although there have probably been no policy decisions made on the basis of this, it is illustrative of how sloppy Working Group Two has been," said Professor Richard Tol, one of the report's authors, who is based at the Economic and Social Research Institute in Dublin. 
“There is no way current climbers and mountain guides can give anecdotal evidence back to the 1900s, so what they claim is complete nonsense,” he added. However, scientists from around the world leapt to the defence of the IPCC, insisting that despite the errors, the majority of the science presented in the IPCC report is sound and its conclusions are unaffected. (ANI)
科技
2016-40/3982/en_head.json.gz/3646
The Digital Skeptic: Will.i.am Brands Show Just How Hard It Is To Get Paid
Written by: Jonathan Blum

NEW YORK (TheStreet) -- Andrew Smits is a big fan of Will.i.am. But when it comes to the gadgets the musician-slash-entrepreneur is spinning up, we agree they probably "Will.not.work."

"The youth market is not the wide open blue water people think it is," Smits told me a few weeks back over drinks and finger food. Smits is no idle fan boy. He's creative director of Concept 73, a San Clemente, Calif.-based marketing agency with serious chops selling action brands such as Rip Curl, Simple Mobile and Vans to kids. He and I were at an industry confab trying to get our brains around how celebs such as Will.i.am, the Black Eyed Peas producer and early collaborator on the Beats by Dre line of headphones, are swimming into ever-odder consumer electronics waters.

Late last year Will.i.am rolled out what had to be the oddest device of recent memory: a clip-on smartphone camera lens case called the foto.sosho. The unit is designed, built and wholesaled by Will.i.am -- or William Adams, as he is known to his parents and the IRS. And it slots over an iPhone to extend its imaging features. High-end U.K. retailer Selfridges & Co. retails it for a stout U.S. $480. Wider global release is expected later in 2013.

And no question, Mr. Adams is a legit master of the cross-platform media jujitsu needed to get such a device off the ground. "I travel a lot. I'm sponging all the time. I am a 'popthroplogist,'" he joked during the International Consumer Electronics Show as he explained his vision of entrepreneurship on stage during an event at the Las Vegas Hilton. Back in 2003, he got it that the Black Eyed Peas track Hey Mama was hotter tied to a hip iPod commercial than as mere music on radio or the Web. He was early to leverage music's brand power to spin up electronics brands such as Beats by Dre, which recently sold to HTC for $300 million. 
Will.i.am even sells his insights to the Fortune 1000. Intel (INTC), Dell (DELL) and Coca-Cola (KO) take serious bets based on Will.i.am's advice. Witness Coke's launch of the Ekocycle brand, which supports products made from recycled material. Fortune magazine went so far as to plop Will.i.am on its January cover as "Corporate America's Hit Machine." "Starting up a product was easier than I thought," he told the crowd.

That all may be true. But one does not need to be a new-media Will.i.am to see that the chance of an iPhone add-on -- or the larger trend of betting on said celebrities as gadget rainmakers -- making any real money is almost incomprehensibly small. "Will.i.am is definitely a success story in making money in the music business," Smits said. "But selling pricey iPhone parts, that's going to be a challenge."
科技
2016-40/3982/en_head.json.gz/3647
Google Hits ‘Glass’ Pedal as Apple Returns to Earth
By Sam Gustin (@samgustin), Feb. 26, 2013
Photo: Carlo Allegri / Reuters. Google co-founder Sergey Brin wears Google Glass before the Diane von Furstenberg spring/summer 2013 collection show during New York Fashion Week on Sept. 9, 2012.

The U.S. technology industry is one of the most dynamic in the world, particularly with respect to mobile and Internet-based computing, two areas that are evolving at breakneck speed. Things can happen very quickly in the tech space: one day you’re up, the next day you’re down. Take Apple and Google, two tech titans currently battling for dominance in the mobile-Internet wars. Over the past several months, Google shares have increased by nearly 20% — last week topping $800 — while Apple shares have fallen by more than 30%. Much of the movement happened in the past few months of 2012, as large investors, including hedge funds, pulled money out of Apple and, in some cases, poured it into Google, in order to maintain exposure to the large-capitalization technology sector, according to Colin Gillis, senior technology analyst and director of research at BGC Financial.

“As Apple started selling off, Google started taking off,” Gillis says in a phone interview. “If you’re an investor and you want exposure to large-cap tech stocks, there aren’t that many places you can go.” The Apple sell-off is being driven in part by growing concerns about whether products like the iPhone and the iPad — devices that Apple is only incrementally improving — can continue to power revenue and profit growth, or whether Apple needs new, breakthrough products. After all, during his legendary career, Apple’s late co-founder Steve Jobs radically disrupted several markets with iconic products like the iPod and iTunes, and the iPhone and iPad, which set the standard for tech innovation. Current Apple CEO Tim Cook has yet to introduce a truly breakthrough new product of his own. 
(MORE: Is Apple Losing Its Shine After Steve Jobs?) “Tim Cook keeps alluding to the company’s great product pipeline, but there’s been an innovation vacuum for a couple of quarters,” says Gillis. “I’m not going to say the story is over — let’s give it one more year — but we’re certainly in a period of incrementalism with Apple.” Scott Kessler, head of technology research at S&P Capital IQ, also raised the issue of incrementalism in Apple’s product cycle. “There are some well-founded concerns about the company’s ability to innovate, especially in light of Steve Jobs’ passing,” Kessler says in a phone interview. “It’s not just about the next big thing, but the next big category. People have been looking for new products and new categories for some time, but they haven’t seen them.” It doesn’t help that Apple has experienced several quarters of slowing growth, which has further spooked investors. Last quarter, Apple generated profit of $13.1 billion, but that was flat compared with the year-ago period — the company’s lowest rate of profit growth in a decade. Google, by contrast, continues to report solid growth thanks to its dominant search engine and online advertising business. Last quarter, net income increased 13% on revenue of $14.42 billion, a 36% increase over one year ago. Google has now jumped ahead of Apple as the most widely held long technology hedge-fund position, according to Goldman Sachs’ new Hedge Fund Trend Monitor report, which analyzed 725 hedge funds with $1.3 trillion in gross assets. (MORE: How Google’s Chief Innovator Sergey Brin Is Making Science Fiction Real) In short, Apple expectations are returning to earth. “Apple has had a tremendous run from 2001 until the end of last year,” says Kessler. “People want the company to invent a new category. In the past, they’ve done that so frequently and successfully that when they don’t seem to do it as much or as profoundly, questions arise.” Meanwhile, Google is hot. 
For example, Google’s new Chromebook Pixel laptop is garnering positive reviews. (“Thank you, Google. For obsoleting my MacBook,” as one CNET writer put it.) And the company’s Google Glass wearable computing project — high-tech Internet-connected specs — is generating the sort of buzz usually reserved for Apple products. The futuristic eyewear will be available to consumers by the end of 2013, just in time for the holiday shopping season, according to several reports. Here’s the latest official Google Glass video: The Google buzz has been further amplified by chatter about the tech giant’s massive, 42-acre (17 hectare) expansion to its Googleplex headquarters at NASA’s Ames Research Center in Silicon Valley. That’s a convenient location, because it’s right next door to NASA’s Moffett airfield, where Google executives keep no fewer than six private planes, including a 757, a 767 and several Gulfstream jets, according to a report last year from NBC Bay Area. (Cash-flush Apple also has an ambitious new headquarters under development, due in 2016.) “Google is getting a lot of attention and a lot of kudos for taking risks and trying something new,” says Kessler. “While Apple is reducing screen size [see the iPad Mini], Google is introducing a whole new product with wearable technology, which reinforces the perception that it’s being innovative.” (MORE: Why Is Apple CEO Tim Cook Sitting Next to Michelle Obama?) As for Apple, it’s telling that Cook has been on something of a p.r. tour in recent months, appearing on the cover of Bloomberg BusinessWeek and showing up for a rare on-camera interview with Brian Williams of NBC News. In an apparent attempt to burnish the company’s image, Cook recently announced plans to spend at least $100 million to “do one of our existing Mac lines in the United States.” (To put that into perspective, Apple made over $50 billion in profit over the past 12 months.) 
And earlier this month, Cook sat with First Lady Michelle Obama during President Obama’s State of the Union address. But Cook is going to need more than high-profile appearances if he wants to restore Apple’s mojo. Incremental updates to existing product lines are well and good, but investors — and consumers — are looking for the company to unveil truly disruptive new products, as it did with the iPod, iPhone and iPad. There has been chatter that Apple might introduce a new TV product, or perhaps a “smart watch,” but thus far those are merely rumors. It’s time for Apple’s next revolutionary product to become a reality.
科技
2016-40/3982/en_head.json.gz/3653
Leigh Kish, Carnegie Museum of Natural History
412.622.3361 (O), 412.526.8587 (C)
[email protected]

BugWorks explores the anatomy of insects and how these physical forms work
A collaboration between Carnegie Museum of Natural History and Carnegie Mellon University School of Design

Pittsburgh, Pennsylvania… Observe insects up-close, and see the world through the eyes of insects to figure out how bugs work. Now open at Carnegie Museum of Natural History, BugWorks provides the rare opportunity to get up-close and personal with a few creepy crawlies and the Carnegie scientists who study them, through large-scale photographs, models, video, specimens, illustrations, and terraria of live insects. BugWorks is a collaboration between Carnegie Museum of Natural History’s Center for Biodiversity and Ecosystems and Carnegie Mellon University (CMU) School of Design, and features exhibition elements designed and developed by CMU students as part of their senior capstone course. This collaboration and BugWorks are funded by The Fine Foundation and the Henry Lea Hillman, Jr. Foundation.

BugWorks focuses on the anatomy of various insects. For example, the lubber grasshopper has very long hind legs, which give it the unimaginable ability to jump distances up to 20 times the length of its body. Examine the forms of different body parts to determine the functions or abilities of that particular bug, much like scientists do. Learn about the wide-ranging “jobs” a bug can do, from pollination to decomposition, and see how diverse, and sometimes downright bizarre, the insect world can be. The stars of the exhibition are alive: a giant water bug, an Emperor scorpion, darkling beetles, Allegheny crayfish, a young tarantula, and some lubber grasshoppers. 
Other highlights include:
- Large-scale Gigaprints of insects—incredibly detailed, high-resolution photographs
- Six enclosures of live bugs going about their daily activities
- Specimens from the museum’s own bug collections
- Video vignettes of Carnegie scientists behind-the-scenes in the insect collection
- An interactive photo booth to snap a picture with a favorite insect (bring your camera!)
- Bug blueprints that illustrate the forms of insect anatomy
- Photographs from a bug’s-eye view, showing environments from the perspective of insects
- Take-home collection cards of western Pennsylvania insects
- Map of western Pennsylvania for visitors to show what bugs they’ve seen near their homes

BugWorks is on view through July 28, 2013, and is free with museum admission.

Center for Biodiversity and Ecosystems
Carnegie Museum of Natural History’s Center for Biodiversity and Ecosystems engages scientists from collaborating institutions worldwide to understand, manage, and sustain the health of local and global ecosystems. It utilizes the museum’s vast collection and the environmental research center Powdermill Nature Reserve as a living laboratory for ecological research and as a site for visiting researchers and educators studying the mid-Appalachian ecosystem. The Center creates interdisciplinary research and educational projects that address some of the most pressing scientific questions of our time: questions regarding changes to the environment—past, present, and future—and how these changes affect nature and human cultures. The Center for Biodiversity and Ecosystems launched in January 2011 and is under the direction of John Rawlins.

Carnegie Museum of Natural History, one of the four Carnegie Museums of Pittsburgh, is among the top natural history museums in the country. It maintains, preserves, and interprets an extraordinary collection of 22 million objects and scientific specimens used to broaden understanding of evolution, conservation, and biodiversity. 
Carnegie Museum of Natural History generates new scientific knowledge, advances science literacy, and inspires visitors of all ages to become passionate about science, nature, and world cultures. More information is available by calling 412.622.3131 or by visiting the website, www.carnegiemnh.org.
科技
2016-40/3982/en_head.json.gz/3788
The "National Climate Service" scam By Alan Caruba web posted January 21, 2002 The one thing you learn as you follow the activities of the environmentalists devoted to the biggest hoax of the modern era, Global Warming, is that they are relentless in their devotion to pursuing the hidden agenda of "climate control." It isn't about the climate at all and never has been. It is about crippling the economy and, thereby, the hegemony of the United States. Yes, "hegemony", because we are without doubt the most powerful nation on the face of the Earth today and a lot of people really hate our devotion to capitalism, to the workings of the free market, to property rights, to guaranteed political freedoms, and the free flow of information. Some of them fly hijacked commercial jets into the symbols of our power, the World Trade Center and the Pentagon. Others seek to undermine our Constitution by luring us into international treaties that have noble-sounding names, but which serve to strip us of the control our government has over our landmass, our natural resources, and other vital aspects of our lives. That has been the purpose of the United Nations Climate Control Treaty, otherwise known as the Kyoto treaty. Even Japan, where the treaty was first proposed, has recently rejected many of its provisions. There's nothing like a recession or depression to get governments and people focused on how to pay the bills. The United States, to its credit, rejected the Kyoto treaty outright as totally bogus and an attack on the stability and maintenance of our economy. Of course, not everyone in government rejected it. The treaty's primary defender and proponent was then-Vice President Al Gore. He was supported in this by former President Bill Clinton. Not surprisingly, Republican-turned-Democrat, Jim Jeffords of Vermont, is also a rabid environmentalist. 
If you think that environmentalists-Greens-have given up on curbing our use of energy to keep our computers going, heat our homes, run our automobiles, fuel our industries, and generally ensure that everything functions as intended, you are very wrong. The list of environmental organizations fills fat directories. There are thousands of these enemies of prosperity, health, and common sense buried in the ranks of government at all levels. They find their greatest support from the Democratic Party.

My friend, David Wojick, who writes for Electricity Daily, recently alerted me and others to a plan to create a "National Climate Service" in the Commerce Department. This nugget is tucked away in the huge Senate energy bill, S-1766. Now, we're not talking about the National Weather Service or the National Oceanic and Atmospheric Administration. No, this new agency would exist to predict the weather ten years, fifty years, a hundred years from now!

At this point, you probably think I'm just making this up. After all, if there is anything we know for sure, it is that even the National Weather Service with the most sophisticated computer models and tracking technologies often gets a prediction for the next day's weather wrong. If it gets it right predicting what the weather will be five days from now, we think they are doing a dandy job. The National Climate Service, we are supposed to believe-and fund-will have the capability of predicting what the weather will be decades, even centuries, from now. This is the United Nations Climate Control Treaty all over again! It is the deliberate lie that current computer weather models have the capability of predicting such distant events or trends. It is the Global Warming gangsters doing what they have been doing since they stopped predicting a coming Ice Age in the 1970s! Another friend of mine, Christopher C. 
Horner, an expert on the way the Greens are perfectly willing to break in through the back door if they can't get through the front one, points out that "it irrefutably is not possible to trace climate activity to man" and that "one volcanic eruption dwarfs anthropogenic (human) activity, exposing the follies of regulating energy use on the basis of man-as-weather-machine." Horner, writing in The Washington Times, said, "Sane policy-making dictates ceasing all machinations toward CO2 (carbon dioxide) reductions and their related economic impact until science answers the threshold question: What concentration level constitutes 'dangerous interference'?" Good question. In the era of the dinosaurs, there was tons more CO2 in the Earth's atmosphere and they thrived for a hundred million years!

The National Climate Service, proposed in Senate bill S-1766, is intended to ignore real science and substitute the outright lies perpetrated by Greens who are devoted to advancing the Communist goal of global government. To achieve this, they came up with the ultimate scare, Global Warming. The proposed new agency would be charged with the responsibility of "developing assessment methods to guide national, regional, and local planning and decision making" based on its utterly, totally bogus predictions. That's why this latest break-and-entry by the Greens is so dangerous. If the National Weather Service cannot predict next week's weather with any certainty, why would this nation want to predicate major policies on the predictions of a National Climate Service? The answer is that this is preposterous! Preposterous, too, are claims about CO2 levels in the atmosphere as a danger to humanity or claims that the Earth is even experiencing a warming. It is not. It has not warmed at all in the past half century since the 1950s. The Bush administration and the Republican Party have got to make a clean break with the environmentalists and their endless, deliberate falsification of science. 
This goes way beyond mere politics. This nation has declared and is waging war on Islamic terrorists around the world. It is time to do the same with the Green Taliban whose purpose is to destroy our economy and undermine our national sovereignty.

Alan Caruba is the founder of The National Anxiety Center, a clearinghouse for information about scare campaigns designed to influence public opinion and policy. The Center maintains an Internet site at www.anxietycenter.com.
科技
2016-40/3982/en_head.json.gz/3790
NCPA's Energy and Environment Blog: Environment and Energy Insights from NCPA's E-Team

Japan’s Lesson: It’s Time to Deal with Spent Nuclear Fuel
By Sterling Burnett, filed under Energy on March 24, 2011, with 7 comments

There have been three main barriers to the construction of new nuclear power facilities: high construction costs, concerns about plant failure leading to a meltdown, and what to do with the spent nuclear fuel (usually called waste). The second problem has been brought to the fore with the crisis at Japan’s Fukushima nuclear plant resulting from the horrific earthquake and subsequent tsunami. Even if the leaked radiation doesn’t ultimately result in significant illness or loss of life (and of course I hope it doesn’t), the questions raised by the still ongoing problems at this plant have only increased fear of nuclear power and almost certainly the costs involved in developing and operating a new facility. Since costs are already steep compared to other alternatives for electric power production, it is doubtful more than a few of the nuclear plants currently in planning or development will be constructed in the next decade (and maybe ever in their current form).

Whether or not we ever build or operate any additional nuclear power plants in this country, the third issue, what to do with the spent fuel, remains. As David T. Stevenson, Director of the Center for Energy Competitiveness at the Cesar Rodney Institute notes, despite all that has been reported about the problems with the multiple failing reactors at the Japanese plant, the most troubling and immediate potential hazard stems from the loss of water cooling the plant’s stored spent fuel rods. Stevenson states that, “The nuclear crisis in Fukushima, Japan shows, beyond a doubt, the time has come to open existing, secure nuclear storage facilities in the United States to avert a similar tragedy. Stored fuel is the biggest concern in Japan. 
We currently store spent nuclear fuel rods at power plants in above ground facilities in secure Transportation, Aging, and Disposal Canisters (TAD). These canisters can be shipped and stored without opening them. There are currently about 71,000 metric tons of spent fuel and high level radioactive waste stored at 121 nuclear power plants and non-military government sites. All of this waste, minus shipping containers, could be stacked forty-one feet high on one football field.”

Stevenson proposes three solutions: storage at Yucca Mountain, storage at the Waste Isolation Pilot Plant (WIPP), and recycling. Interestingly, these are the same three solutions I examined in a paper released March 2010. I agree with Stevenson: the time for talk is past, now is the time to either start shipping spent nuclear fuel to the permanent storage facilities which science has already demonstrated time and again to be safe, or to recycle the spent fuel for continued operation of currently existing facilities and to reduce the overall waste stream that ultimately needs to be stored. As Stevenson explains, both the money and the facilities exist to handle spent nuclear fuel — all that has been lacking is the political will to act. Hopefully, Japan’s nuclear crisis will serve as a forceful prod getting U.S. politicians to act.

Simon says: March 28, 2011 at 11:40 am
Great post–which of the two options is more cost-effective over time? How often would there be a need to transport waste?

Larry Harmon says: March 28, 2011 at 1:24 pm
Sterling: Great article. I enjoyed talking to you about a month ago. Keep up the great work. 
Alexis says: April 1, 2011 at 8:44 am
Interesting article, these are all great solutions and important to discuss.

Jean says: April 4, 2011 at 12:56 pm
Interesting article. We’ll have to wait and see what Washington ultimately decides to do. A shame that it takes travesties to shock them into concern, and even still, there is no guarantee that they will act.

Hugh says: May 28, 2011 at 3:33 pm
More could be said about the history of recycling spent nuclear fuel. The USA had the lead in recycling spent control rods in the 1970s until Pres. Jimmy Carter issued an executive order closing the recycling facilities. France recycles all spent nuclear fuel. The result is a small volume of less “toxic” waste that can easily be stored. Japan followed our lead on nuclear energy so part of the problems now are the unintended but predictable consequences of Jimmy Carter’s executive order. Much of the problem in Japan involved the overheating of stored spent fuel rods. I don’t think we can wait for Washington to make a decision without being pressured by citizen groups. They don’t have the knowledge or the “cojones” to make the decision themselves.

Big D says: October 25, 2013 at 11:09 am
Good article highlighting the immediate effects of cronyism and radiophobia. Recycling nuclear waste isn’t done partly because it’s cheaper to use fresh U than reprocess the “spent” fuel rods. The “spent” fuel rods have 95+% of the U still there–the Pressurized Water Reactors now in service are that inefficient. Meanwhile, engineering of Liquid Fluoride Thorium Reactors (LFTRs) based on research done at Oak Ridge in the 60’s is proceeding–in China. 
LFTRs have the ability to consume the “nuclear waste.” Oh by the way, LFTRs are safe (incapable of leaking, melting, or exploding), efficient (99% of actinides consumed), clean (no emissions, no CO2, tiny amount of short-lived waste), economical (after we defeat the regulatory and lawfare juggernauts fueled by radiophobia) and useful beyond electricity (process heat, desalinization, fuel generation, medical isotopes).
科技

[Chinese Homepage]

Industry-specific models play a crucial role in driving intelligent transformation and innovative development across enterprises. High-quality industry data is key to improving the performance of large models and realizing industry applications. However, the datasets currently used for industry model training generally suffer from issues such as insufficient data volume, low quality, and lack of domain expertise.

To address these problems, we constructed 22 industry data processing operators and applied them to over 100TB of open-source data, including WuDaoCorpora, BAAI-CCI, RedPajama, and SkyPile-150B, cleaning and filtering it down to 3.4TB of high-quality, multi-industry classified Chinese and English pre-training data: 1TB of Chinese and 2.4TB of English. To make the data easier to use, we annotated the Chinese data with 12 types of labels, including alphanumeric ratio, average line length, language confidence score, maximum line length, and perplexity.
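Several of these labels are simple surface statistics over the raw text. A minimal sketch of how three of them could be computed — the function name and exact definitions here are illustrative assumptions, not the pipeline's actual operators:

```python
# Sketch of three per-document quality labels mentioned above:
# alphanumeric ratio, average line length, and maximum line length.
# Definitions (e.g. counting characters, not bytes) are assumptions.

def quality_labels(text: str) -> dict:
    lines = text.splitlines() or [""]
    line_lengths = [len(line) for line in lines]
    alnum = sum(ch.isalnum() for ch in text)
    return {
        "alnum_ratio": alnum / max(len(text), 1),
        "avg_line_length": sum(line_lengths) / len(line_lengths),
        "max_line_length": max(line_lengths),
    }

labels = quality_labels("abc 123\nhello world!")
# -> {'alnum_ratio': 0.8, 'avg_line_length': 9.5, 'max_line_length': 12}
```

Labels like these let downstream users filter rows by threshold without re-scanning the raw text.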

Furthermore, to validate the dataset's performance, we ran continued pre-training, supervised fine-tuning (SFT), and direct preference optimization (DPO) on a medical-industry demonstration model. The results showed a 20% improvement on objective metrics and an 82% subjective win rate.

- Industry categories: 18 categories including medical, education, literature, finance, travel, law, sports, automotive, news, etc.
- Rule-based filtering: Traditional Chinese conversion, email removal, IP address removal, link removal, Unicode repair, etc.
- Chinese data labels: alphanumeric ratio, average line length, language confidence score, maximum line length, perplexity, toxicity character ratio, etc.
- Model-based filtering: industry classification language model with 80% accuracy
- Data deduplication: MinHash document-level deduplication
- Data size: 1TB Chinese, 2.4TB English
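The MinHash document-level deduplication named above can be sketched with the standard library alone. This is an illustration of the idea only — the shingle size, the 64 "permutations", and the use of salted blake2b hashes as stand-ins for independent hash functions are all assumptions, not the operator actually used in the pipeline (production systems typically add LSH bucketing so each document is compared against only a few candidates):

```python
import hashlib

def shingles(text: str, k: int = 5) -> set:
    # Character k-shingles of a whitespace-normalized, lowercased document.
    t = " ".join(text.split()).lower()
    return {t[i:i + k] for i in range(max(len(t) - k + 1, 1))}

def minhash(text: str, num_perm: int = 64) -> list:
    # One signature slot per "permutation": the minimum hash of any shingle
    # under a seeded hash function (the salt stands in for a permutation).
    return [
        min(
            int.from_bytes(
                hashlib.blake2b(s.encode(), digest_size=8,
                                salt=seed.to_bytes(8, "big")).digest(),
                "big",
            )
            for s in shingles(text)
        )
        for seed in range(num_perm)
    ]

def est_jaccard(sig_a: list, sig_b: list) -> float:
    # The fraction of matching slots estimates the Jaccard similarity of
    # the two shingle sets; near-duplicate documents score close to 1.0.
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

A document would then be dropped when its estimated similarity to an already-kept document exceeds some threshold (0.8 is a common choice, though the pipeline's actual threshold is not stated here).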

Industry classification data size:

| Industry Category | Data Size (GB) | Industry Category | Data Size (GB) |
| --- | --- | --- | --- |
| Programming | 4.1 | Politics | 326.4 |
| Law | 274.6 | Mathematics | 5.9 |
| Education | 458.1 | Sports | 442 |
| Finance | 197.8 | Literature | 179.3 |
| Computer Science | 46.9 | News | 564.1 |
| Technology | 333.6 | Film & TV | 162.1 |
| Travel | 82.5 | Medicine | 189.4 |
| Agriculture | 41.6 | Automotive | 40.8 |
| Emotion | 31.7 | Artificial Intelligence | 5.6 |
| **Total (GB)** | **3386.5** | | |
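As a sanity check, the 18 per-industry figures sum to the stated total of 3386.5 GB, which is the roughly 3.4TB quoted above (values copied from the table):

```python
# Per-industry sizes in GB, copied from the table above.
sizes_gb = {
    "Programming": 4.1, "Politics": 326.4, "Law": 274.6, "Mathematics": 5.9,
    "Education": 458.1, "Sports": 442, "Finance": 197.8, "Literature": 179.3,
    "Computer Science": 46.9, "News": 564.1, "Technology": 333.6,
    "Film & TV": 162.1, "Travel": 82.5, "Medicine": 189.4,
    "Agriculture": 41.6, "Automotive": 40.8, "Emotion": 31.7,
    "Artificial Intelligence": 5.6,
}
assert len(sizes_gb) == 18                         # 18 industry categories
assert round(sum(sizes_gb.values()), 1) == 3386.5  # matches the Total row
```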

For ease of downloading and use, we have split the full dataset into 18 industry-specific sub-datasets. This repository contains the sub-dataset for the technology industry.

Data processing workflow:



Collection including BAAI/IndustryCorpus_technology