Wednesday, December 21, 2011

Skype Delivers Free Holiday Wifi to Fifty US Airports

The folks over at Skype are providing a welcome gift to weary travelers this holiday season. Unwrap an hour of free wifi access in more than fifty airports across the US. 
According to this announcement:
"From December 21st thru December 27th, travelers passing through or delayed in over 50 airports across the country will be able to access third-party hotspots using Skype WiFi and connect with loved ones via a Skype video or voice call for free."
To take advantage of this offer, you need to have the most recent version of the Skype client installed on your PC or Mac. After you sign into your Skype account, just check your wireless connection to verify you are in a supported location.

Grab your copy of Skype for Windows here, for Mac here and for Linux here.
Skype will brighten the holidays for travelers in major airports in New York, Chicago, Miami, Denver, Burbank and San Francisco, among many others.
Skype WiFi customers can also take advantage of this special holiday gift. Skype WiFi is a service which offers paid access to over a million wifi hotspots worldwide.
After your trip, you can share your free holiday wifi experience at #freeskypewifi on Twitter.


Article first published as Skype Delivers Free Holiday Wifi to 50 US Airports on Technorati.

Friday, November 18, 2011

Mars Science Lab 'Curiosity' Will Search for Signs of Life

'Curiosity' Rover will Search for Signs of Life
NASA is preparing to launch another rover mission to Mars on November 25, 2011. The goal of this mission will be to search for signs of past life on the Red Planet.

According to NASA's website,
"...Curiosity has 10 science instruments to search for evidence about whether Mars had environments favorable for microbial life, including the chemical ingredients for life."
Dubbed 'Curiosity', this seven-foot-tall rover is twice as big as previous Mars rovers and weighs about a ton. It carries more than ten times the mass of scientific equipment as the Spirit and Opportunity rovers launched in 2003.

Spectacular Mars sunset
Propelled by an Atlas V rocket, the ambitious mission will last two years and focus on the Gale crater. Also known as the Mars Science Laboratory, this rover will carry more scientific equipment than has ever been sent to another planet.

Curiosity will attempt a first-ever, multi-stage precision landing. A supersonic parachute, deployed at approximately 10 km (6.2 miles), will slow the spacecraft's descent, then eight directional rocket thrusters will allow controllers to adapt to the environment and steer the craft toward the landing area. Finally, a 'sky crane' will gently lower the rover to the planet's surface.

Mars Gale Crater
Gale crater is believed to be about three and a half billion years old and 154 km (more than 95 miles) in diameter. This location was chosen for its rich combination of morphologic and mineralogical evidence of water in Mars' past. It contains minerals that are conducive to fossil preservation, and the crater provides a surface that Curiosity can navigate safely. The choice was designed to "...identify a particular geologic environment, or set of environments, that would support microbial life."

This mission comes on the heels of an attempt by Russian scientists to land a probe on one of Mars' moons, Phobos, and return samples to Earth. The spacecraft is currently trapped in low Earth orbit, and scientists are struggling to restart its booster engines before it falls back to Earth with tons of unspent rocket fuel on board.

Watch the launch live plus other special events, documentaries, news conferences and much more at NASA TV.

images: NASA / JPL-Caltech
rover: http://www.nasa.gov/mission_pages/msl/index.html
gale crater: http://en.wikipedia.org/wiki/File:Gale_crater.jpg
Martian sunset: http://upload.wikimedia.org/wikipedia/commons/thumb/6/69/MarsSunset.jpg/998px-MarsSunset.jpg


First published as Mars Science Lab 'Curiosity' Will Search for Signs of Life


Google Defies RIAA Over Download App

Google defies RIAA
A battle of behemoths is brewing as Google goes toe to toe with the RIAA (Recording Industry Association of America) over the availability in the Android Market of "MP3 Music Download Pro", an app which can be used to download copyrighted music.

Google contends that the app can also be used to download legal files and has so far refused to remove it from the Android Market.

The RIAA sent a takedown notice to Google in August over the app, which the RIAA said "...is clearly being used for illegal purposes," and Google responded that it was declining to remove the app from the Android Market.

Downloader App Screenshot
Google has removed apps from the market in the past that could be used for obtaining copyrighted music, but an RIAA spokesperson complained that the same or similar apps often reappear a few days later and that "...too many apps created to harvest links to unauthorized files remain available and popular on the Android marketplace, resulting in widespread infringement of copyrighted works."

In May 2011, Google Chairman Eric Schmidt called a proposed law, the 'Protect IP Act', "A disastrous precedent for Free Speech". This legislation, designed to combat offshore servers and endorsed by the RIAA, would require search engines and DNS servers to remove links and make targeted websites 'disappear' from the Internet.

Rumors persist that Google is working on agreements with the major record labels and plans to offer music purchases through the Google Music Store, similar to the successful Apple iTunes Music Store.
Downloader Pro Barcode

Google is building anticipation for a November 16, 2011 event "These Go To Eleven", an homage to the old 'This Is Spinal Tap' flick. There is speculation the announcement will include music purchases through the new music store and sharing services through Google+.

The RIAA has declined to say if they plan on filing suit against Google for facilitating copyright infringement. If they choose to take Google to court, the RIAA may find Google to be a tougher opponent than the thousands of private citizens the RIAA has been suing for several years.

First published as Google Defies RIAA Over Download App

Monday, November 14, 2011

Crippled Russian Spacecraft Carries Toxic Payload

Russian spacecraft Phobos-Grunt
Russian scientists are struggling to restart engines on the Phobos-Ground probe, which failed to leave orbit following its launch on November 8, 2011. The ship, containing tons of dangerous rocket fuel, could drop out of orbit and fall to Earth within a few days, or could linger in a decaying orbit until around Christmas next month.

Tons of unspent rocket fuel and an uncontrolled re-entry could create a very dangerous combination depending on where the craft comes down and the state of the fuel.

If the fuel does not freeze, but remains liquid, it should burn up harmlessly before it reaches the ground. But if the toxic mixture freezes, it could survive re-entry and strike the Earth intact. According to James Oberg, formerly with NASA and now a space consultant,
“About seven tons (6.4 tonnes) of nitrogen tetroxide and hydrazine, which could freeze before ultimately entering, will make it the most toxic falling satellite ever”
Signs of trouble began when Russian scientists lost contact with the probe and asked amateur astronomers worldwide to report any sightings. South American astronomers spotted the craft trapped in a low orbit, trailed by its failed booster engines.

Software engineers have attempted upgrades, bug fixes and reboots of the system, but according to this report, they have so far been unable to communicate with the craft and hope is fading that a solution will be found before the batteries fail.
Mars Moon Phobos
Engineers are hoping that if the spacecraft falls back to Earth, it will land in the ocean. Compared to the six-ton UARS satellite, which dropped out of orbit in September 2011, this failed spacecraft and its booster engines weigh over fourteen tons, most of which is unspent fuel.

The ship was headed for the Martian moon Phobos on an ambitious mission to return to Earth next year with about seven ounces of soil samples. Instead, this is just the latest, and most dangerous, in a very long string of failed Mars missions by the Russian Federal Space Agency.

Article first published as Crippled Russian Spacecraft Carries Toxic Payload on Technorati.

images:
phobos ground probe: http://commons.wikimedia.org/wiki/File:Cebit_2011-fobos-grunt_together_with_upper_stage.jpg
phobos: http://commons.wikimedia.org/wiki/File:Phobos_PIA10369.jpg 

Thursday, November 10, 2011

Adobe Quits Mobile Flash Development

Adobe will no longer support mobile Flash
Adobe has announced they are dropping development of the popular Flash plug-in for mobile browsers. They will continue to provide security updates and critical bug fixes for existing hardware and software configurations, including Android and the BlackBerry PlayBook.

Adobe is expected to axe 750 jobs as it shifts focus from mobile Flash development to helping designers and developers of mobile Flash apps migrate to open HTML5 and the Adobe AIR runtime, which supports a wide variety of plug-ins and platforms. Adobe stock dropped 10% by 9:30 am following the announcement.

Flash was developed over a decade ago and designed with the desktop PC in mind. The shift to smaller, mobile devices has highlighted some fundamental problems with Flash: notoriously high CPU usage, which drains batteries and causes overheating, and apps designed for a pointy-clicky mouse versus the modern touchscreen interface.

HTML5 provides audio, video and other feature-rich browser components with lower CPU usage, and supports the capacitive touch-sensitive screens used in smaller, low-power devices such as phones and tablets.

According to the Adobe Blog,
"We will no longer continue to develop Flash Player in the browser to work with new mobile device configurations (chipset, browser, OS version, etc.) following the upcoming release of Flash Player 11.1 for Android and BlackBerry PlayBook. We will of course continue to provide critical bug fixes and security updates for existing device configurations."
Flash is pervasive on the web, to both the delight and frustration of mobile phone and tablet users. HTML5 is better suited to this new generation of small, low-powered devices. With more than a third of the world's top 100 websites already implementing HTML5, this paradigm shift has already begun.

Creative Commons images:
flash on droid: http://www.geeky-gadgets.com/adobe-to-give-free-android-phones-to-its-staff-03-05-2010/
html5: http://www.w3.org/News/2011#entry-8992


Article first published as Adobe Quits Mobile Flash Development on Technorati.

Tuesday, November 8, 2011

Flurry of Earthquake 'Anomalies' Caused by Fracking?

Seismograph
Update: March 9, 2012:
A series of Ohio earthquakes were caused by fracking according to State officials. Tough new laws will regulate drilling and disposal of waste.*

The 5.6 magnitude earthquake which struck Oklahoma in late 2011 was the latest in a string of 'rare' earthquakes shaking the normally geologically quiet interior of North America.

These previously unusual events are raising fresh concerns that the use of fracking and the disposal of fracking fluids may be causing the quakes.

Fracking is the fracturing of layers of rock with high pressure fluids to release pockets of trapped gas and oil. Rising energy prices and Bush-era deregulation have led to widespread use of this technique, with thousands of new wells brought online in the past few years.

This proliferation in fracking has coincided with numerous earthquakes in unusual locations. Recent studies indicate the culprit may actually be the disposal wells. Fracking produces large volumes of waste fluid, which is collected and injected at high pressure into disposal wells.

Earthquakes caused by these wells are not the paranoid imaginings of environmental zealots. In this recent report, the Oklahoma Geological Survey studied a swarm of earthquakes in Garvin County, OK in January of this year, which began within hours of the start of a new, deep hydro-fracturing project nearby.
The Geological Survey stopped short of definitively blaming the earthquakes on the fracking operation, but noted
"...The strong correlation in time and space as well as a reasonable fit to a physical model suggest that there is a possibility these earthquakes were induced by hydraulic-fracturing."
Predictably, the report concludes that it is impossible to say definitively whether the quakes were caused by fracking, but other locations hit by anomalous quakes are satisfied they have proof enough.

In June of this year, Cuadrilla Resources was forced to suspend its fracking operation after admitting it had caused a swarm of earthquakes in normally seismically peaceful northwest England.

In February 2011, the Arkansas Oil and Gas Commission issued an emergency moratorium on new injection wells after a 4.7 magnitude quake struck as part of a suspicious swarm of earthquakes in the area.

In August of 2009, Chesapeake, one of the largest oil and gas exploration companies in the area, was forced to shut down two wells in the Dallas/Fort Worth area after they were linked to a swarm of earthquakes.

In other events, fracking was suspected, but no action taken:

Drilling Rig
In October 2011, a rare 4.8 magnitude quake hit near San Antonio, Texas, in an area which has been heavily drilled and fracked.

In August of 2011, a rare 5.8 quake hit Virginia and was felt all over the east coast. Speculation ensued that dramatically increased fracking in the area may have triggered this rare event.

The same day, a 5.3 earthquake struck southern Colorado in the same area as a swarm of possible fracking-induced quakes investigated in 2001 by the USGS. Three months before the August 2011 quake, the EPA announced plans to study the impacts of fracking on drinking water in that area.

No one knows the impact of the large-scale fracturing of the earth's crust or the ramifications of high pressure injection of waste fluids deep underground. The evidence is mounting of a direct connection between these practices and unusual earthquake activity, but it may be impossible to scientifically prove.

These strange quakes have caused quite a bit of damage but no deaths so far. Energy companies continue to deny any connection between earthquakes and fracking or disposal of fracking fluids. The evidence, however, continues to pile up.

How many 'rare' earthquakes does it take before we acknowledge a pattern? And what price are we willing to pay, in lives and property, for the promise of 'cheap' energy?

*Update: March 09, 2012:
A series of Ohio earthquakes were caused by fracking according to State officials. Tough new laws will regulate drilling and disposal of waste.

In this report, the Ohio Department of Natural Resources said:
"After investigating all available geological formation and well activity data, ODNR regulators and geologists found a number of co-occurring circumstances strongly indicating the Youngstown area earthquakes were induced," state officials stated. "Specifically, evidence gathered by state officials suggests fluid from the Northstar 1 disposal well intersected an unmapped fault in a near-failure state of stress causing movement along that fault."

Article first published as Flurry of Earthquake 'Anomolies' Caused by Fracking? on Technorati.

images:
drill rig: http://commons.wikimedia.org/wiki/File:Drilling_for_gas_and_oil_in_Dalby_Forest_February_2007_-_geograph.org.uk_-_343560.jpg
seismograph: http://commons.wikimedia.org/wiki/File:Kinemetrics_seismograph.jpg

Sunday, November 6, 2011

Debian Beckons Ubuntu Refugees to Come Home

Debian Live desktop
Dissatisfaction continues over Ubuntu's choice of the Unity interface as the default and, in the most recent release, the lack of any obvious way to return to the old Gnome desktop.

Longtime Ubuntu users have been complaining loudly about Unity's lack of stability, limited options and overall unfinished feel.

Distros that have watched Ubuntu gobbling up the Linux mind-share are suddenly getting a second look by unhappy Ubuntu users seeking alternatives to Unity.

Ubuntu started life as a simplified Debian with an emphasis on desktop usability. Recent Ubuntu releases seem focused on blazing their own trail toward a touchscreen, cloud enabled, widget driven environment. This may prove to be a very forward thinking plan, but it leaves traditional Gnome users hungering for their familiar desktop environment.

I decided to take another look at Ubuntu's parent, Debian. They offer live cd/dvds so I downloaded the i386 dvd .iso of the current stable release 6.0, aka 'Squeeze'. (All the Debian releases are named after characters from the 'Toy Story' movies.)

As a test machine, I scrounged up an ancient Dell Inspiron 1150. This dinosaur sports a 30 GB harddrive, 2.6 GHz processor, wifi and 512 MB of RAM. Although I love my bling, I did not test compiz on this box due to the low specs.

Debian calls itself "The Universal Operating System" and nothing beats its support for a wide variety of hardware and architectures. Sound, video and ethernet were configured and worked automagically from the live cd.

Clicking the Install icon opens a graphical installer which walks the user through the usual steps: language, location, keyboard and timezone. Enter user and administrator passwords, computer and host names.

The partitioner offers a simple, guided install for a range of configurations. More advanced setup options are available by selecting 'Manual' install. The installation is essentially the same as most gnome-gui based installers and virtually painless.

The desktop is plain, vanilla Gnome 2 with a cartoonish space wallpaper (see above). There are many more wallpapers included by default and Ubuntu users will recognize many beautiful favorites, including 'Cosmos' the space slide show.

Live Earth Wallpaper on Debian/Gnome2
I replaced the default background with a Live Earth wallpaper that updates hourly throughout the day. You can find installation instructions here.
 
Neither ethernet nor wifi would work after install. A quick google search found this documented bug and fix. Apparently, networking auto-configure is disabled by default.

I edited (as root) /etc/NetworkManager/NetworkManager.conf to set managed=true and, following a restart of Network Manager, ethernet finally connected.
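The whole fix boils down to a couple of commands. For safety, this sketch works on a temporary copy of the config; on a real Squeeze box you would edit /etc/NetworkManager/NetworkManager.conf itself (as root) and then restart the network-manager service:

```shell
# Work on a throwaway copy that mimics the stock Squeeze config.
CONF=$(mktemp)
printf '[main]\nplugins=ifupdown,keyfile\n\n[ifupdown]\nmanaged=false\n' > "$CONF"

# Flip the "managed" flag so NetworkManager takes over the wired interface.
sed -i 's/managed=false/managed=true/' "$CONF"

grep managed "$CONF"   # now shows: managed=true

# On the real system, follow the edit with:
#   /etc/init.d/network-manager restart
```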

This Dell machine uses an old Broadcom wireless chipset, notoriously difficult to get working due to the closed nature of their firmware. Thankfully, in September of 2010, Broadcom finally began offering fully open Linux drivers for their chipsets.

Per instructions here, I entered in a terminal: sudo apt-get install firmware-b43-installer. Following a reboot, my wireless network was located and, after entering my password, I was online in seconds.

I updated my repositories to include non-free software packages. Replace 'squeeze' with 'testing' for more current software updates, at the cost of slightly more breakage. Upgrades are incremental, or 'rolling': no need to reinstall every six months.
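For reference, a Squeeze-era /etc/apt/sources.list with the non-free component enabled might look like this (a sketch; the mirror URL is only an example, substitute your local mirror):

```
deb http://ftp.us.debian.org/debian/ squeeze main contrib non-free
deb-src http://ftp.us.debian.org/debian/ squeeze main contrib non-free
deb http://security.debian.org/ squeeze/updates main contrib non-free
```

After editing, run apt-get update (as root) to refresh the package lists. Swapping 'squeeze' for 'testing' on each line moves you to the rolling testing branch.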

Debian uses apt for package management, with Synaptic as the familiar front end. Update manager notifies you of available updates and the Software Center makes adding and removing applications easy.

Debian came with gnash, the open implementation of Adobe Flash, installed by default. If you prefer Adobe's version, it is available in the repositories.

User friendly distros like Ubuntu and Mepis were built on Debian's stability, massive software repositories and superior apt package management system. Debian may lack the polish of these derivatives, but it is also a blank canvas, ready to take on your own look and feel. Experiment with new themes, icon sets, wallpapers and more at gnome-look.org

I was pleasantly surprised at the improvements in usability and ease of installation in Debian. Gnome2 seems as comfortable and familiar as an old pair of slippers. I think I will give Debian another try as a Desktop OS and it feels surprisingly good to come home.

A full review of Debian Squeeze can be found here.

Article first published as Debian Beckons Ubuntu Refugees to Come Home on Technorati.

What Went Wrong at Fukushima? 24 Hours to Meltdown

Reactor 3 Explodes at Fukushima
A report by the IEEE (Institute of Electrical and Electronics Engineers) takes a close look at what went wrong at the Fukushima Nuclear Power Plant following the March 11, 2011 earthquake and tsunami.
 
With billions of dollars in research and technology invested in nuclear energy, the report identified six common-sense and seemingly obvious lessons which could have minimized or prevented the meltdown.

A massive 9.0 quake struck at 2:46 pm on March 11, 2011 off the east coast of Japan. At the Fukushima Dai-ichi Nuclear Plant, operated by Tokyo Electric Power Co. (TEPCO), Units 1, 2 and 3 of the six reactors were operating; Units 4, 5 and 6 were down for scheduled maintenance. The quake caused the plant to perform a routine auto-shutdown without incident.

Power outages caused by the quake were widespread. Within 10 seconds, twelve diesel generators activated to power the water pumps cooling the fuel rods. So far, all emergency procedures were working as planned.

At 2:52 pm, Unit 1's non-electrical isolation condenser (IC), a backup cooling system, was cooling the reactor too quickly, and a plant supervisor shut it down per normal procedure.

Tsunami alerts predicted a 3 meter (9.8 ft) wave would strike the Fukushima prefecture. The Dai-ichi plant sits 10 meters (33 ft) above sea level. As a precaution, non-essential personnel began evacuating the plant.

The first wave struck at 3:27 pm, with a second set of much bigger waves arriving at 3:35 pm, approximately fifty minutes after the quake.

The second huge wave topped the seawalls and surged through the plant. It destroyed the heat-removal seawater pumps and inundated the rooms controlling valves, pumps and other crucial equipment. Later, TEPCO employees would estimate the killer wave at 14 meters (46 ft) high from water stains on the walls.

Six generators located in basements were drowned and five more shut down when control rooms were flooded. Only one generator serving reactors 5 and 6, not located in a basement, continued operating. This lone functioning generator helped units 5 and 6 survive the disaster while the other reactors spiraled out of control.

Lessons from Fukushima
Even backup batteries failed, and Reactor 1 suffered a complete power failure. The control room went dark and instrument panels stopped functioning. Cooling system pumps failed, and the water which was supposed to be cooling the radioactive fuel rods began to boil. Steam built up inside the reactor building. Without working gauges and instruments, operators were not sure how much water was left to cool the rods.

The non-electrical IC cooling system serving Reactor 1 had been shut down early in the crisis because it was working too well. Now, without power, plant operators were unable to reopen its valves, even manually.

Operators struggled to regain power at the plant. They scavenged batteries out of cars in the parking lot and called out a small fleet of power-generating trucks. However, the earthquake and tsunami had ruined roads, mass evacuations clogged highways, and the trucks promptly became stuck in traffic.

At 4:36 pm, TEPCO finally officially alerted the Japanese government to the problem at Reactor 1.

Around 9 pm, working by flashlight, operators ingeniously powered up a few important instrument panels using the scavenged car batteries and were relieved to see that the water cooling the fuel rods in Reactor 1 seemed to be holding up so far. Water levels were down, but the rods were not exposed.

Later, company analysis showed the instruments were incorrect. The water level had dropped so low the rods were completely exposed. Temperatures had topped 1300 °C (2372 °F) and the meltdown had already begun.

Around midnight, more instruments were brought online and showed that pressure inside the containment vessel had already exceeded its maximum design limit and an explosion was a serious risk.

Teams struggled through the night and next day to vent the explosive pressure in the containment vessel and cool the rods. Power trucks finally arrived and prepared to restart pumps cooling the crippled reactor.

Fire hoses poured on fresh water until the tanks were empty, then in desperation crews started using highly corrosive sea water. This was a tacit admission that saving the plant was no longer an option; the focus now was on preventing a massive nuclear disaster.

Unknown to operators, the meltdown was proceeding. Superheated fuel rods had begun to melt through the steel floor of the pressure vessel. Pressure built inside the reactor as residents within 10 km (6.2 miles) of the plant were evacuated.

Attempts to release the pressure from hydrogen gas inside the reactor continued but were not enough to prevent a catastrophic explosion almost exactly 24 hours after the tsunami hit the plant.

The explosion cut off power from the trucks and severed the fire hoses. The flow of cooling water ceased as radiation levels climbed and plant operators scrambled for safety.

The disaster continued to spiral out of control. The plant was now a radioactive hot spot and choked with debris from the tsunami and explosion of reactor 1. Workers struggled to cool Reactors 2 and 3, but without power or pumps, Reactor 3 exploded on March 14, followed by a possible explosion inside number two later that day. Later, another explosion tore the roof off building four.

As this slow-motion catastrophe unfolded, workers fought gallantly to contain it but efforts were continually hampered by the lack of power which caused pumps to fail and rendered safety controls useless. Japanese officials later admitted that three reactors suffered full meltdowns.

Certainly, lessons will be learned from this disaster. Nuclear power plant designers worldwide will be studying Fukushima for years to come and will develop better system designs and disaster plans.

But looking at how events unfolded, it was mainly a lack of planning in the common-sense, low-tech processes which brought Fukushima to its knees.

Article first published as What Went Wrong at Fukishima? 24 Hours to Meltdown on Technorati. 

Images:
Reactor 3 image: http://en.wikipedia.org/wiki/File:Fukushima_I_by_Digital_Globe.jpg

Lessons background image: http://www.flickr.com/photos/bagalute/5127578547/sizes/m/in/photostream/  

Monday, October 31, 2011

Android Powered Smart Watches Coming Soon

I'm Watch
First published as "Android Powered Smart Watches Coming Soon" on Technorati

A new generation of smart watches is ready to hit the market. Sporting the Android OS, these watches do much more than tell time: they provide quick access to email, texts, social networking sites and videos, and can even take phone calls.

Two start-up companies are poised to enter the market. The "I'm Watch" from Italy does not support wifi, but will tether via Bluetooth to your smartphone and provide web access using your existing data plan. Download music, videos and apps from the new "I'm Market", the I'm Watch online store.

The I'm Watch sports a curved 1.5 inch, 240 x 240 pixel multi-touch screen and weighs about 70 grams or about 2.5 ounces. It comes in a rainbow of colors and styles and is powered by a Freescale IMX233 CPU (454MHz) and 64 MB of memory. Its 450mAh battery can run for about 24 hours on standby and about four hours when running apps. Speaker and mic are included.  (specs are here: .pdf)

The I'm Watch starts at 299 euros, or about 423 dollars. With 50% up front, delivery is expected in about 90 days. It currently runs Android 1.6 'Donut' for its lower memory consumption, but backports for Froyo and Gingerbread are available.

WiMM Labs is offering up a platform for a variety of small, Android powered devices which provide wifi and bluetooth connectivity, an accelerometer, magnetometer and vibrate alerts.

They envision a range of smart gadgets including wearable watches, pendants and belt clips. Other useful WiMM powered devices might include clip on bike models and usb powered mini-computers.

The WiMM-powered watch features a 1.4 inch color touchscreen display. It supports up to 32 MB of memory, is water resistant and weighs only 22 grams, less than one ounce.

Use it to screen calls, texts and social media. It will come loaded with custom watch faces and handy apps for caller ID, Calendar, Weather and more.

No more digging through a cluttered purse or fumbling for a smartphone in your pocket. Get ready for a new generation of Android-powered mini devices which will make information, networking and social media even more accessible and useful than ever before.

DIY - Ghost Hunters Toolkit


Do you believe in ghosts? Millions of people do and stories of haunted places and the spirits that dwell there have persisted throughout history.

From the two-thousand-year-old story of the ghost of an old man, complete with rattling chains, to the ancient Roman festival of Lemuria, during which residents performed a ritual exorcism to clear their homes of the evil spirits of the dead, ghosts have haunted our homes and dreams through the ages.

Studies show about one third of people believe in ghosts and millions have had experiences they cannot explain. TV shows like Ghost Hunters and Paranormal Witness are among many aimed at believers and the curious.

You don't have to be a professional ghost hunter to do your own investigation. The tools of the trade are widely available online and even as apps for your smart phone. Here's a roundup of the most common ghost-busting tech to liven up your Halloween.

Basic equipment should include a camcorder with a good quality microphone and night vision. Use webcams for remote rooms. Connect these to a central computer to record paranormal activity.

A digital camera with good resolution and an old fashioned 35mm camera for capturing ghostly images.

Cassette tape recorder. The analog method of recording sound can capture background noises which may be missed by digital recordings.

Motsha Ghost Detector
A classic ghoul finder, the EMF detector registers changes in electric and magnetic fields. These are relatively inexpensive and are even available as apps for your smartphone.

Check out Motsha Ghost Detector and Ghost Meter-PKE Detector, both free from the Android Market. iPhone users can try Phantom Radar or Ghost Radar among others.

A compass is handy as a backup EMF Detector. It will also respond to changes in the magnetic field.

A thermometer is essential for verifying temperature fluctuations and identifying hot and cold spots.

Finally, don't forget the flashlights and extra batteries for all your electronic equipment.

The stories have persisted for thousands of years. Finally, we can use modern technology to help answer that age old question, "Do you believe in ghosts?"

First published as DIY- Ghost Hunters Toolkit on Technorati by Michelle Blowers

ghost window image: http://www.flickr.com/photos/calliope/5054420781/sizes/m/in/photostream/ 

Millions of Tons of Tsunami Refuse Could Reach US Coast

 First Published as "Millions of Tons of Tsunami Refuse Could Reach US Coast" on Technorati

image: JAMSTEC, IPRS, NOAA, NASA
Millions of tons of debris from the massive 9.0 earthquake and tsunami that devastated Japan in March 2011 is making its way across the Pacific Ocean and will eventually reach the west coast of the United States.

Following the record quake, crushing tsunami waves reached 133 ft (40.5 meters) in height and traveled up to six miles (10 km) inland. Nearly twenty thousand people were killed or have gone missing.
When the massive waves retreated, they pulled millions of tons of demolished houses, cars, furniture and the remnants of thousands of lives out to sea. Some of these items will sink as they move across the ocean, but many will not.

A staggering five to twenty million tons of refuse, containing everything from house parts and appliances to the minutiae of people's lives, will likely begin to arrive at the Midway Islands, which lie between Japan and Hawaii, sometime this winter. The debris plume is estimated to be two thousand miles long and a thousand miles wide.

Debris Plume image: US Navy
The mass of floating refuse will reach Hawaii in the winter or spring of 2013 and finally wash ashore in 2014 along the beaches of North America's west coast from British Columbia and Alaska through Washington, Oregon and California. It is not expected to contain radioactive material.

This prediction is the result of a model (.pdf) developed by Nikolai Maximenko, a senior researcher at the International Pacific Research Center in Hawaii. He studied thirty years of ocean currents using data from thousands of buoys dotting the ocean. Recently, a Russian ship sailing between Honolulu and Vladivostok, Russia spotted the Japanese debris field just where Maximenko's model predicted it would be. You can view an animation of the projected path here.

Whatever remains of this giant plume of trash and debris will eventually make its way to the infamous 'North Pacific Garbage Patch', a giant vortex of mostly chemical and plastic garbage from the US and Japan that accumulates, trapped by ocean currents, in a huge area of the Pacific Ocean.

This marine garbage collection point was predicted in 1988 by NOAA and verified in 1997. Estimates of its size vary greatly, ranging from 270,000 sq mi (700,000 square kilometers) to more than 5,800,000 sq mi (15,000,000 square kilometers). There is also a garbage patch accumulating in the Atlantic.

Massive Study Proves Climate Change is No Hoax

First published as "Massive Study Proves Climate Change is No Hoax" on Technorati

An independent review of more than one and a half billion temperature records from fifteen sources, spanning more than a century, clearly shows that the planet is warming. Researchers at the University of California, Berkeley have confirmed previous reports that global temperatures have risen by 1 degree Celsius (nearly 2 degrees Fahrenheit) overall.


The team included Saul Perlmutter, winner of the 2011 Nobel Prize in Physics, along with climatologists and statisticians. Compiling a huge open database of temperature records, the researchers found a striking correlation with earlier American and British studies, and the data clearly support the conclusion that warming is occurring.

This study, the most comprehensive and thorough to date, focused on some nagging questions in the debate. According to this statement:
"The most important indicator of global warming, by far, is the land and sea surface temperature record. This has been criticized in several ways, including the choice of stations and the methods for correcting systematic errors. The Berkeley Earth Surface Temperature study sets out to do a new analysis of the surface temperature record in a rigorous manner that addresses this criticism. We are using over 39,000 unique stations, which is more than five times the 7,280 stations found in the Global Historical Climatology Network Monthly data set (GHCN-M) that has served as the focus of many climate studies."
Skeptics have long criticized the use of data sources considered to be of 'poor' reliability. However, the results remained surprisingly consistent regardless of the perceived accuracy of the reporting stations: despite local variations, the overall trend matched that of stations considered 'reliable'.
Researchers found that there is an 'urban heat island' effect which is significant locally. However, since less than 1% of the global land area is urban, these 'heat islands' were not deemed significant to the planet's overall climate.

Complicating the issue, the data show that one third of stations, many in the US and northern Europe, reported cooling over the last seventy years.

These cooler readings could be due to local variations and expected background 'noise' in the data. It may also be evidence of uneven warming of the planet.
Changes in ocean currents, wind and storm patterns will undoubtedly cause some areas to experience colder, wetter winters and more severe storms. 

The study also shows that in areas experiencing warming, temperatures may have risen much higher than the global average of 1 degree Celsius.

In fact, recent studies show temperatures at the poles continue to rise faster than in other areas of the globe. Disproportionate heating at the poles has contributed to the rapid disappearance of sea ice and thawing of permafrost, releasing even more CO2.

You can review the data here and draw your own conclusions. The evidence seems overwhelming, but a vocal minority of skeptics and a lack of political courage continue to thwart any action as temperatures rise and we move ever closer to a possible climate 'tipping point'.

Sunday, October 23, 2011

Heads Up! Another Satellite is Falling to Earth


Article first published as Heads Up! Another Satellite is Falling to Earth on Technorati.

Pieces of another crippled satellite, this one the German ROSAT, are expected to fall to Earth this weekend, with perhaps thirty fragments surviving re-entry, including a one and a half ton, heat-resistant mirror traveling at 280 mph (450 kph).


The orbit of ROSAT takes it over the highly populated area between 53 degrees north and 53 degrees south latitudes. You can follow the current orbit here and view re-entry statistical animations.

The odds of someone actually being struck by a piece of the defunct ROSAT satellite are about one in 2,000. Those odds are higher than the one-in-3,200 chance posed by the UARS satellite last month.

The six ton UARS (Upper Atmosphere Research Satellite) fell out of orbit and plunged into the Pacific Ocean around midnight on Sept. 23, 2011. It briefly captured the world's attention and spawned sales of 'I Survived The UARS Satellite Crash' T-shirts.

Earth's orbit is clogged with the debris of fifty years of space exploration. A cloud of space junk consisting of more than 370,000 items, including chunks of old satellites and rocket booster engines, orbits our planet. Occasionally, the orbits of some of these objects decay to the point where they are pulled down by Earth's gravity and fall toward the ground. Most of these are small and burn up on re-entry.

Scientists are concerned we may have reached a 'tipping point' where continuous collisions could make passing through Earth's orbit too dangerous, potentially ending space travel as we know it.
 
The problem reached a critical level when the Chinese tested an anti-satellite weapon in 2007 which may have added 150,000 new pieces to the debris field. This was followed by a collision of two orbiting satellites in 2009. These two events doubled the number of objects in the space junk cloud and may have brought us perilously close to being trapped on Earth.

This gauntlet of debris may pose too great a risk for astronauts to pass through, and recently forced six residents of the International Space Station to take shelter in two docked Russian Soyuz spaceships when a piece of space debris threatened to strike the station. 
 
The United States Defense Advanced Research Projects Agency (DARPA) released a report in 2009 called 'Catcher's Mitt' which looked closely at the problem.

Strategies to deal with this envelope of space trash have ranged from lasers and nets to giant space sweepers, but the problem continues to snowball and no action has been taken as yet.

images:
http://www.flickr.com/photos/gsfc/4384863741/sizes/m/in/photostream/ 

European Space Agency: http://www.esa.int/

Thursday, October 20, 2011

Group-Think: Peer Pressure Shown To Alter Memories

Our memories are affected by what others think
 Article first published as Group-Think: Peer Pressure Shown To Alter Memories on Technorati.

Why is eyewitness testimony notoriously unreliable? Why are advertising and propaganda so effective? Researchers have shown (.pdf) that our memories can be strongly affected by what we believe others perceive, and these altered memories can persist even after the manipulation has been disclosed.

Thirty adults watched a documentary style video in small groups. After three days, they were brought back individually and given a memory test to check their accuracy and confidence in what they had seen.

Four days later, they returned to take another memory test, but this time, first they were 'allowed' to see others' answers to the test questions. They were hooked up to a functional magnetic resonance imaging (fMRI) machine and shown false responses, supposedly by the other members of their small group.

The participants were tested again one week later, and the results were striking. Over 68% of them had altered their memory of the event to conform with what they thought was the majority view. In control groups tested without social manipulation, only 15% gave wrong answers.

Even after the full scope of the deception was revealed, more than 40% persisted in believing their falsified memory.

This shows how susceptible we are to group-think and peer pressure. It illustrates how easy it can be to manipulate someone's memories by convincing them that the majority saw things differently.

What causes this memory-shift? The fMRI performed during the test on day seven provided some clues.

The fMRI showed activity in the amygdala region of the brain, which affects how we remember social and emotional events. It seems to act as a sort of gatekeeper, influencing the long- and short-term memories of the hippocampus.

This social and emotional influence over memory may have a survival component, allowing us to accept the guidance of the larger group, even when it conflicts with our own feelings.

Our tendency toward 'herd mentality' may have contributed to the survival of our species, but we must guard against social memory-manipulation, which could shape our opinions through advertising and propaganda, or foster tolerance for social injustice.


Wednesday, October 19, 2011

No More Surprises: Cell Phone Companies Agree to Overage Warnings

http://www.flickr.com/photos/angelshupe/
 Article first published as No More Surprises: Cell Phone Companies Agree to Overage Warnings on Technorati.

In an effort to avoid regulation, wireless providers have agreed to notify customers who are reaching their limits on voice, data, text and international roaming charges.

In a joint statement with the FCC, Consumers Union and CTIA, a large wireless industry association, the major cell companies announced they will provide free warnings that users are about to incur overage charges on these commonly used services.

CTIA members include wireless industry heavyweights such as AT&T, Verizon, T-Mobile, Sprint and TracFone.
Half of these notifications must be working by Oct. 17, 2012, with the remainder up and running by April 17, 2013. This is good news for consumers. According to this report by the Wireless Consumer Association International, about 13.5 percent of customers will go over their plan's voice limit, and almost one in five (18 percent) will exceed their data limit.

Cell phone customers have long complained about 'bill shock' after unknowingly exceeding their plan limits. Companies have made billions off these charges, and they agreed to these notifications only to avoid mandates from Washington. The joint statement quotes President Obama,
“Far too many Americans know what it’s like to open up their cell-phone bill and be shocked by hundreds or even thousands of dollars in unexpected fees and charges... Our phones shouldn’t cost us more than the monthly rent or mortgage. So I appreciate the mobile phone companies’ willingness to work with my Administration and join us in our overall and ongoing efforts to protect American consumers by making sure financial transactions are fair, honest and transparent.”

The FCC and the policy counsel for Consumers Union praised the agreement, saying more than 97% of wireless customers will benefit from the new rules, and urged the companies to implement the notifications quickly.
The CTIA President and CEO said,

..."Today’s initiative is a perfect example of how government agencies and industries they regulate can work together under President Obama’s recent executive order directing federal agencies to consider whether new rules are necessary or would unnecessarily burden businesses and the economy.”

Because compliance is 'voluntary', Consumers Union said it will work closely with the FCC to make sure companies comply, "and we're pleased the Commission is keeping this proceeding open to help ensure compliance."

So, today cell phone customers enjoy a rare victory against surprise overages and Americans get a rare example of government, industry and consumer groups working together to protect users.

Tuesday, October 18, 2011

Broken Windows? Ubuntu Linux Saves the Day

Ubuntu Live CD Desktop
Article first published as Broken Windows? Ubuntu Linux Saves the Day on Technorati.

You may have heard about Linux. Perhaps you imagined it as a clunky DOS-like command line system, used by uber-geeks in dark basements and server rooms to perform their geeky techno-magic.

In truth, Linux has matured into an easy-to-use operating system, complete with a vast ecosystem of free software that rivals Windows and Mac in simplicity and beauty. It rarely needs a reboot, and viruses are virtually unknown.

Canonical has just released Ubuntu 11.10, its latest version of the popular Ubuntu Linux distribution. Calling itself 'Linux for Human Beings', Ubuntu aims to be one of the most newbie-friendly Linux distributions. Its innovative 'Unity' GUI (graphical user interface) is designed for simplicity and functionality.

Ubuntu is not shareware or spyware. It is a full-featured system which is provided free of charge through 'Open Source' licensing.

Open Source is a philosophy as much as a license. It gives developers the right to use and build upon the progress of previous developers. Unlike proprietary software, the source code is provided for review, modification and distribution.

This type of collaborative development leads to rapid progress and innovation. Linux runs everything from desktop systems and smartphones to toasters and supercomputers.

You don't have to install Ubuntu to try it out on your computer. Many Linux distributions are offered as 'Live CDs/DVDs' which run completely from your CD/DVD drive. They come with a full library of software pre-installed, covering everything from an MS Office-compatible office suite to photo editing and email.
 
Linux Mint is based on Ubuntu, but includes popular software, like Flash, Java and proprietary video drivers by default.

Simply download and burn the .iso (disk image) to a CD/DVD. Drop it in your computer's CD/DVD drive and reboot. Almost everything works 'out of the box': video, sound, wifi, printers and networking. You can explore Linux and enjoy all the free software without risk. Simply eject the disk when you're done and the computer reboots as before.
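Before burning, it's worth checking that the download arrived intact, since a corrupted .iso is a common cause of a disc that won't boot. Here's a minimal sketch of the checksum step; a small dummy file stands in for the real image so the commands are safe to try anywhere. With an actual download, you would hash the real .iso and compare it against the SHA256SUMS list published on the mirror.

```shell
# Stand-in for the real download, so this sketch can run anywhere
printf 'pretend ISO contents' > ubuntu.iso

# Mirrors publish a checksum list alongside the image; we create one here
sha256sum ubuntu.iso > SHA256SUMS

# Re-hash the file and compare it against the list; prints "ubuntu.iso: OK"
sha256sum -c SHA256SUMS
```

If the check fails, re-download the image before wasting a blank disc on it.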

These live CDs are a great tool for malware-infected Windows computers. In a pinch, they can provide you with a fully functional and secure system, loaded with the latest software and hardware drivers. Use them to back up files from a dying hard drive or fix a borked master boot record. At least you will have a working system until you can get Windows fixed (again).
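The backup step can be sketched as follows. This is a minimal illustration using temporary directories as stand-ins: in real use you would mount the Windows partition (for example at /mnt/windows) and copy to an external USB drive, and for drives with physically failing sectors a dedicated tool such as GNU ddrescue is the safer choice. All paths below are assumptions for the demo.

```shell
# Stand-ins for the mounted Windows partition and the rescue USB drive
mkdir -p /tmp/windows/Documents /tmp/usb-backup
echo 'irreplaceable file' > /tmp/windows/Documents/notes.txt

# -a copies recursively while preserving timestamps and permissions
cp -a /tmp/windows/Documents /tmp/usb-backup/

# Confirm the copy landed on the rescue drive
ls /tmp/usb-backup/Documents
```

From a live session the same two commands, pointed at the real mount points, are often all it takes to pull your documents off a machine that will no longer boot into Windows.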

Use a Live Linux CD to clean an infected Windows system yourself. Boot using a live CD and simply visit a reputable online virus scanner like Trend Micro, Bitdefender or Kaspersky, or use a free malware detection toolkit like the Kaspersky Rescue Disk.

If you choose to keep Linux on your computer, most Live CDs will walk you through installation, either completely replacing Windows or installing alongside it (called dual-booting).

So head on over and download one of these great free systems. Explore thousands of free software titles, from games to utilities. Keep the live CD handy to rescue malware-crippled Windows computers or retrieve data from an unbootable hard drive.

You don't have to be a geek to enjoy the freedom that Linux systems offer. They are simple to use, virtually virus free and solid as a rock.

And while you're at it, take a moment to remember Dennis Ritchie, a father of C and Unix, upon which most modern computing is based. He passed away last week, but these fundamental contributions will live on. RIP and thank you from all of us.