Friday, 21 March 2014

Gupta: Cell phones, brain tumors and a wired earpiece



By Dr. Sanjay Gupta, Chief Medical Correspondent


Just about every time I use a cell phone, I plug in my wired earpiece first. Because I have discussed the use of earpieces on several news shows, people expect to see me using one. If I am walking around the CNN studios, my colleagues often comment on it. In airports, people will stop me in the rare cases I forget to use the earpiece and remind me about it. Perhaps they are intrigued because I am a neurosurgeon who openly shows some concern about cell phones.


Truth is, using an earpiece is a pretty easy thing to do. Furthermore, my neck doesn’t hurt after being on the phone for a long conference call, and given that many of those calls take place in a car, an earpiece becomes a requirement. Still, I don’t want to dodge the obvious question: Do cell phones cause brain cancer?


It may be too early to say for sure. The latency period, meaning the time between exposure and recognition of a tumor, is around 20 years, sometimes longer. And cell phones have been in widespread use in the U.S. for only around 15 years. Back in 1996, there were 34 million cell phone users; today there are nine to ten times as many. Keeping that in mind, it is worth taking a more detailed look at the results of Interphone, a multinational study designed to try to answer this question.


The headline from this study was that there was little or no evidence to show an association between cell phones and cancer. But if you went to the appendix of the study, which interestingly was available only online, you found something unsettling. The data showed that people who used a cell phone for 10 years or more had double the risk of developing a glioma, a type of brain tumor. Notably, most of the studies that have shown an increased risk come from Scandinavia, where cell phones have been popular since the early 1990s. For these reasons, the whole issue of latency could become increasingly important.


Cell phones use non-ionizing radiation, which is very different from the ionizing radiation of X-rays, which everyone agrees are harmful. Non-ionizing radiation won’t strip electrons or bust up DNA. It’s more like very low power microwaves. Short term, these microwaves are likely harmless, but long term could be a different story. Anyway, who likes the idea of a microwave, even a low-powered one, next to their head all day?
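
For the numerically inclined, it is easy to see why physicists draw the ionizing/non-ionizing line where they do. A back-of-the-envelope sketch in Python (assuming a typical cellular carrier of about 1.9 GHz; the exact band doesn’t change the conclusion):

    # Back-of-the-envelope: a cell-signal photon vs. the energy needed to ionize.
    # The ~1.9 GHz carrier is an assumed typical US cell band.
    PLANCK_J_S = 6.626e-34      # Planck's constant, joule-seconds
    EV_IN_JOULES = 1.602e-19    # one electron-volt, in joules

    carrier_hz = 1.9e9
    photon_ev = PLANCK_J_S * carrier_hz / EV_IN_JOULES

    ionization_ev = 13.6        # energy to ionize a hydrogen atom, for scale

    print(f"Cell photon energy: {photon_ev:.1e} eV")                    # ~7.9e-06 eV
    print(f"Shortfall vs ionization: {ionization_ev / photon_ev:,.0f}x")  # ~1.7 million x

A single cell-phone photon carries roughly a millionth of the energy needed to knock an electron loose, which is why the long-term worry centres on heating and other subtle effects rather than DNA damage.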


And, what about kids? I have three of them, aged 5, 4 and 2. Fact is, they are more likely to lead to my early demise than cell phones. But, as hard as it is to believe sometimes, they actually have thinner skulls than adults, and will probably be using cell phones longer than I ever will.


The first person to encourage me to regularly wear an earpiece was Dr. Keith Black. He is also a neurosurgeon, and makes a living removing – you guessed it – brain tumors. Keith has long believed there is a link, and for some time, his was a lonely voice in this discussion. Nowadays, he has loud and prominent voices accompanying him. Ronald Herberman, director of the University of Pittsburgh Cancer Institute, sent a memo warning staffers to limit their cell phone use. One of the possible consequences, he says, is an increased risk of brain cancer. The city of San Francisco is trying to pass an ordinance requiring radiation warning labels on all cell phones. The European Environment Agency has said cell phones could be as big a public health risk as smoking, asbestos and leaded gasoline. Even the makers of cell phones suggest you don’t place a device against your head, but rather advocate holding it 5/8 of an inch to a full inch away.


Many will roll their eyes at this, scoffing at the precautionary principle on display here. Fair enough. Still, I like my wired earpiece, and I don’t have to turn my life upside down to use it. I also text and email a lot more, because my kids rarely allow me to have a phone conversation. Speaking of kids, you will probably see mine using earpieces too, when my wife and I decide they are old enough to use one, which isn’t in the foreseeable future.




Sunday, 16 March 2014

Facebook Acquires WhatsApp for $19bn (£11bn)

Social media giants Facebook have purchased smartphone messaging app WhatsApp in a deal worth $19bn.


According to official statistics, WhatsApp has around 450 million monthly users. The makers of the app claim that it registers 1 million new users every day.


For those not in the know, WhatsApp is an Internet-based messaging service that allows people to get around text message charges. It works in much the same way that SMS (or ‘text messaging’) works, but crucially, it is free to use; after the first year, the service costs a small fee of $1 a year.
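
For the technically curious, the key phrase is ‘Internet-based’: your message travels as ordinary data over your internet connection rather than over the carrier’s SMS channel, which is why it is free. A toy sketch of the idea in Python follows; the host name and port are placeholders, and WhatsApp’s real protocol is, of course, proprietary and far more sophisticated:

    import socket

    # Toy illustration only: a chat message travelling as plain Internet data.
    # 'chat.example.com' and port 5222 are made-up placeholders.
    MESSAGE = "Hello from my data plan - no SMS charge!"

    with socket.create_connection(("chat.example.com", 5222)) as sock:
        sock.sendall(MESSAGE.encode("utf-8"))   # bytes out over TCP/IP
        reply = sock.recv(1024)                 # e.g. a delivery acknowledgement
        print(reply.decode("utf-8", errors="replace"))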


This is by far Facebook’s largest acquisition to date and has been met with some scepticism, but Facebook founder Mark Zuckerberg doesn’t seem daunted by the huge price tag; he described WhatsApp as “incredibly valuable” in a statement announcing the deal.


Prior to this deal, Facebook’s biggest purchase had been photography app Instagram, for which they paid $1bn.


Richard Taylor, North America Technology Correspondent with the BBC, said, “Some are seeing the $19bn price tag as further evidence of swollen valuations of companies as the sector experiences what may yet prove to be another dotcom bubble. WhatsApp does give Mark Zuckerberg inroads into international markets and, as importantly, to a younger demographic. But what is less clear is whether the finances will add up in the long term”.


The acquisition includes $4bn in cash, about $12bn in Facebook shares and about $3bn in stock options for WhatsApp founders and employees (of which there are around 50).
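
A quick bit of arithmetic shows how those figures stack up, and what the options portion could mean per employee (using the rounded numbers above; actual individual grants were obviously not equal):

    cash_bn = 4       # $4bn in cash
    shares_bn = 12    # about $12bn in Facebook shares
    options_bn = 3    # about $3bn in stock options
    employees = 50    # approximate WhatsApp headcount

    total_bn = cash_bn + shares_bn + options_bn
    print(f"Total deal value: ${total_bn}bn")                            # $19bn
    print(f"Options per person: ~${options_bn * 1000 / employees:.0f}m")  # ~$60m each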


WhatsApp co-founder Jan Koum has also become a member of Facebook’s board of directors. “We’re excited and honoured to partner with Mark and Facebook as we continue to bring our product to more people around the world,” said Mr. Koum. Koum has also stated that he does not intend to allow advertising on the app.


Zuckerberg stated that he believed that WhatsApp was well on its way to having a billion users.


In an interview with BBC News, senior research analyst with eMarketer Cathy Boyle said, “WhatsApp actually has greater penetration in a lot of international markets than Facebook.” It is possible that by linking the two services, Facebook will be able to increase its customer base. She went on to say, “WhatsApp is trying to siphon the billions that the telecom industry would make from [traditional SMS text messaging].” If that is Facebook’s intention (and we have to consider it as one of them), then it actually makes good business sense.


SOURCES:


http://www.bbc.co.uk/news/business-26266689




Which Major Discoveries led to the Invention of the Two-Way Radio?

(Asked by ‘Scottish’ Pete from Woolwich)


Hey Pete, how’s everything? Thanks for your question.


…And what a question it is. The modern two-way radio, a direct descendant of the WW2-era walkie-talkie, first became recognizable in the years just before the outbreak of World War 2. The technology’s origins are an interesting story in their own right (but I’ll condense it here).


Three names are usually mentioned with regards to the invention of the walkie-talkie…


The first is Canadian inventor Donald Hings (1907 – 2004), who invented an early version of the technology back in 1937 (although it wasn’t widely acknowledged or used at the time). Then, there’s American inventor Al Gross (1918 – 2000), who invented and patented his own version, which came to be known as the ‘walkie-talkie’, a year later in ’38. Because of the ubiquity of that name, Gross became the best known ‘inventor’ of the technology, even though it had technically existed for 12 months beforehand. However, this isn’t to detract from Gross’ claim, because his version of the walkie-talkie was actually quite different from Hings’ (despite operating on the same essential principles).


Then, there’s Dan Noble (1901 – 1980), a Motorola employee who, although he definitely did not invent the technology, certainly did lead the team that created the widely used WW2-era walkie-talkies. Hings’ version of the technology wasn’t used by the military until 1942, which led to Dan Noble being credited with the invention.


So, make of that mess what you will…


Now, to go back further (and get to the meat of your question), here is a list of discoveries that led to the creation of the two-way radio.


James Clerk Maxwell (1831 – 1879), a mathematical physicist (and one of a seemingly endless line of genius Scotsmen), demonstrated that electromagnetic waves could propagate in free space in his 1865 paper ‘A Dynamical Theory of the Electromagnetic Field’ (whose most famous fan was Albert Einstein). This led German physicist Heinrich Hertz (1857 – 1894) to build on Maxwell’s pioneering work by conclusively proving the existence of electromagnetic waves in 1887.
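
For anyone curious what ‘demonstrated’ actually involved, the modern textbook version of Maxwell’s argument is short enough to sketch. In empty space, his equations combine into a wave equation whose propagation speed is fixed by two measurable constants, and that speed comes out as the speed of light:

    % In vacuum (no charges or currents), Maxwell's equations read:
    %   div E = 0,   div B = 0,
    %   curl E = -dB/dt,   curl B = mu_0 eps_0 dE/dt.
    % Taking the curl of the two curl equations and simplifying yields a wave equation:
    \nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \frac{\partial^2 \mathbf{E}}{\partial t^2},
    \qquad c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^8 \,\mathrm{m/s}.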


After that, Serbian-American inventor, physicist, vegetarian and absolute genius Nikola Tesla (1856 – 1943) demonstrated the transmission of radio frequency energy in 1892. After that, Italian inventor Guglielmo Marconi (1874 – 1937) built a wireless system capable of transmitting signals over unprecedented distances in 1895 – which is pretty much the birth of radio.


This was an important area of study at the time; the first wireless telephone conversation took place in 1880 and was made by Alexander Graham Bell (1847 – 1922), who was another Scot, incidentally. A lot of people were working on similar technology, so the atmosphere would not have been unlike that of the ‘space race’ of the ’50s and ’60s.


Marconi went about taking over pretty much all business related to the invention of the radio (which was, eventually, credited solely to him) and, by 1907, he had established the first commercial transatlantic radio service (and also pretty much screwed Tesla out of any/all royalties he would have been owed. Nice).


Thanks to the work of Julio Cervera Baviera (1854 – 1929), the Spanish army became the first to use radio for military purposes (at least, as far as I’m aware, anyway) in the early 1900s.


Canadian inventor Reginald Fessenden (1866 – 1932) (who also helped to develop sonar and TV, incidentally) invented AM radio (no, not the ‘Breakfast Show’ – AM stands for ‘amplitude modulation’, a way of imprinting sound onto a radio wave that lets different stations broadcast on different frequencies) when, on Christmas Eve 1906, he played some violin and read from the Bible over the airwaves.
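
If ‘amplitude modulation’ sounds abstract, here’s a minimal sketch in Python of what Fessenden’s trick amounts to; the frequencies are illustrative, not historically accurate:

    import numpy as np

    # Amplitude modulation in miniature: the sound rides on the carrier by
    # varying the carrier wave's amplitude. Example frequencies only.
    fs = 100_000                        # samples per second
    t = np.arange(0, 0.01, 1 / fs)      # 10 milliseconds of signal

    carrier_hz = 10_000                 # the station's assigned frequency
    audio_hz = 440                      # the sound being broadcast (an 'A' note)

    audio = np.sin(2 * np.pi * audio_hz * t)
    am_signal = (1 + 0.5 * audio) * np.sin(2 * np.pi * carrier_hz * t)

    # A rival station simply picks a different carrier_hz, which is why many
    # stations can share the airwaves without talking over one another.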


Eventually, all ships were equipped with radio transmission capability, with Marconi owning a total monopoly over ship-to-shore communication. Ship-to-shore contact became a subject of increased awareness and importance following the Titanic disaster of 1912 and radios began to be seen even more as a crucial safety measure in all areas of industry as a result. Look up the 1913 ‘International Convention for the Safety of Life at Sea’ (it has a Wikipedia page, I just checked) for more info.


Skipping forward a bit, now. Throughout the 1930s, there were a ton of minor (and major) improvements made to the technology, more than a few made by Marconi and his engineers. Some really clever people made their mark on the fledgling technology here, but if I mention them all, we’ll never get to the end.


Oh, by the way, FM (frequency modulation) radio was subsequently invented by American electrical engineer Edwin Armstrong (1890 – 1954) in 1933.


By the late ’30s, Hings comes into the picture, as does the rising spectre of a terrifyingly advanced Nazi Germany. The race was on to have the best equipped armies out there fighting the Axis powers, and the Allies wisely put a huge amount of manpower into the development of portable radio communication. It was a decision which led directly to the rapid co-opting of Hings’ and Gross’ work, as well as the later improvements made by Noble.


This is a long and fascinating story (about which many books have been written), but, as a ‘potted history’ of sorts, I hope that answers your question. 




Tuesday, 11 March 2014

Out of Africa: Earliest Human Footprints Found in UK

The earliest evidence of human footprints outside of Africa (where most experts believe modern humans first appeared) has been discovered in the United Kingdom.


The prints, believed to be some 800,000 years old, were identified on the shore at Happisburgh, a small village on the Norfolk coastline. The footprints represent a major prehistoric find, as they are direct evidence of the earliest known humans in northern Europe.


Dr. Nick Ashton, of the British Museum, said the footprints are “one of the most important discoveries, if not the most important discovery, that has been made on [Britain’s] shores.”


The hollow, foot-shaped markings were discovered during a low tide last year, when unusually rough seas exposed an area of sandy beach.


Sadly, the footprints were washed away fairly quickly, but they were visible long enough to be properly recorded, photographed and studied. Dr. Ashton and his team worked hard to document the monumental discovery, even as heavy rainfall filled the tracks. “The rain was filling the hollows as quickly as we could empty them,” he told a BBC reporter.


Fortunately, the team was able to obtain a 3D scan of the prints. This scan revealed that the footprints likely belonged to a group consisting of an adult male and a few children, which has led some experts to speculate that the prints were left by a prehistoric family group. The scan was so accurate that the adult’s shoe size could be determined: a comfortable 8.


Dr. Isabelle De Groote of Liverpool John Moores University was the first to confirm that the hollows were human footprints. She told the BBC: “They appear to have been made by one adult male who was about 5ft 9in (175cm) tall and the shortest was about 3ft. The other larger footprints could come from young adult males or have been left by females. The glimpse of the past that we are seeing is that we have a family group moving together across the landscape.”
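
The article doesn’t say exactly how the team converted foot length into stature, but a common anthropometric rule of thumb is that a foot is roughly 15% of a person’s height. A rough sketch in Python (both the ratio and the size-8 foot length below are approximations of mine, not figures from the study):

    # Rough stature estimate from a footprint. The 0.15 ratio is a standard
    # anthropometric approximation, not the Happisburgh team's actual method.
    foot_length_cm = 26.5          # approx. foot length for a UK size 8
    height_cm = foot_length_cm / 0.15

    feet = int(height_cm // 30.48)
    inches = (height_cm % 30.48) / 2.54
    # Prints "~177 cm, about 5ft 10in" -- close to the reported 5ft 9in.
    print(f"~{height_cm:.0f} cm, about {feet}ft {inches:.0f}in")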


The family, however, were not modern humans. Experts believe that they likely belonged to a group called Homo antecessor. Remains of this extinct human species (or possibly subspecies) have been found throughout Europe, most notably in Spain. They are thought to be among the continent’s earliest human inhabitants.


It is generally accepted that Homo antecessor was either a relative of Homo heidelbergensis (an early human considered most likely to be the direct ancestor of both modern humans and Neanderthals), or else the same species. In either case, H. heidelbergensis is known to have lived in Britain about 500,000 years ago, which is about 300,000 years after changing temperatures are thought to have wiped out Britain’s Homo antecessor population.


Homo heidelbergensis is said to have evolved into Homo neanderthalensis (Neanderthal Man), who lived alongside our own Homo sapiens ancestors until about 40,000 years ago, when the receding ice (and possibly competition for food) signaled the end for our last surviving sister species.


Interestingly, in 2010, Dr. Ashton and his team discovered stone tools at Happisburgh of a kind known to have been used by H. antecessor. It is a discovery that neatly complements that of the footprints. This find, and other supporting material, effectively confirms the presence of early humans in Britain about one million years ago.


According to Dr. Ashton, the find will rewrite our understanding of British and European prehistory. To put that into perspective a little, the Happisburgh footprints are the only find of this age ever seen outside of Africa, and even within Africa, only three sets of prints are considered to be older.


800,000 years ago, the earliest Britons left a lasting mark on the landscape. In doing so, they inadvertently sent us a message from the past about who they were and how they might have lived.


SOURCES:


http://www.bbc.co.uk/news/science-environment-26025763


http://en.wikipedia.org/wiki/Homo_antecessor


http://en.wikipedia.org/wiki/Homo_heidelbergensis




Sunday, 9 March 2014

Look what I found on the Internet 03

For our next round of pictures, we have some funnies that have recently come up…

Thanks for looking!




I need a new camera… or do I, now that I have my phone?

The digital camera has changed the way we take photos. A digital camera (or ‘digicam’) is an electronic device that can take still photos as well as video. It records images using an electronic image sensor, and the results can be viewed on the camera’s own mini screen, or in a photo editor on a PC or Mac.

The digital camera is able to take photos at the click of a button (excuse the pun!), and the photo can be seen instantaneously, which is particularly useful on a night out or for the important holiday snap. Capture that moment with the latest technology.

Great value cameras are now widely available. They are designed to be small and portable, are suitable for the casual snapshot, and are commonly called point-and-shoot cameras. Many of the top companies, including Sony, Panasonic, Canon, Fujifilm, Samsung and Nikon, have top-of-the-range models. The modern digital camera is full of features: 14 megapixels with a 5x optical zoom is now the standard, HD video capability ships with most models, and GPS is starting to be built in, letting you geo-tag photos with longitude and latitude.
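
If you’re wondering what geo-tagging actually does, the camera writes the coordinates into the photo’s EXIF metadata, and you can read them back yourself. A sketch using the Python Pillow library (the filename is just an example; very old Pillow versions expose the degree/minute/second values slightly differently):

    from PIL import Image                 # pip install Pillow
    from PIL.ExifTags import GPSTAGS

    def read_gps(path):
        """Return (latitude, longitude) from a JPEG's EXIF data, or None."""
        exif = Image.open(path)._getexif() or {}
        gps_raw = exif.get(34853)         # 34853 is the EXIF 'GPSInfo' tag
        if not gps_raw:
            return None
        gps = {GPSTAGS.get(key, key): val for key, val in gps_raw.items()}

        def to_degrees(dms, ref):
            degrees, minutes, seconds = (float(x) for x in dms)
            value = degrees + minutes / 60 + seconds / 3600
            return -value if ref in ("S", "W") else value

        return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
                to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

    print(read_gps("holiday_snap.jpg"))   # example file, e.g. (52.52, 13.40)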

Photos taken on a digital camera can be adjusted and improved in a photo editor, and there are many on the market, including Photoshop and Pixlr. A photo editor can adjust and improve images at the touch of a button, taking out red eye and other nasties, and letting you print your pictures the way you want to see them.
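
Red-eye removal sounds like magic, but a crude version of the trick is simple: within the eye region, find pixels where the red channel swamps green and blue, and tone the red down. A toy sketch with NumPy and Pillow; real editors detect the eyes automatically, whereas the filename and eye box here are placeholders:

    import numpy as np
    from PIL import Image

    img = np.array(Image.open("portrait.jpg")).astype(int)  # placeholder file
    x0, y0, x1, y1 = 120, 80, 160, 110    # a box around one eye (placeholder)

    eye = img[y0:y1, x0:x1]               # a view into the image
    r, g, b = eye[..., 0], eye[..., 1], eye[..., 2]
    redeye = r > 1.5 * (g + b) / 2        # pixels where red dominates

    # Replace the offending red values with the green/blue average.
    eye[..., 0] = np.where(redeye, (g + b) // 2, r)

    Image.fromarray(img.astype("uint8")).save("portrait_fixed.jpg")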

Many cameras use removable SD cards that can be taken out and put into a PC or Mac for viewing. These come in lots of different sizes, with the larger cards holding several thousand pictures. Many cameras now also have HDMI ports, so you can show your photos on an HDTV or monitor.
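
‘Several thousand’ is easy to sanity-check, assuming a 14-megapixel JPEG weighs in at roughly 4MB (a typical figure of mine, though it varies with scene and quality settings):

    card_gb = 32     # a common 'larger' SD card size
    photo_mb = 4     # assumed rough size of a 14-megapixel JPEG

    photos = card_gb * 1000 / photo_mb
    print(f"~{photos:.0f} photos per card")   # ~8000 photos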


Friday, 7 March 2014

The Gadget with a Thousand Uses: How Science Fiction has Become Science Fact

When Francis Bacon wrote that “books must follow sciences, and not sciences books”, movies had yet to be invented (the line was first published in 1657), but I think, after reading that quote, we can be fairly certain what his attitude to the cinema would have been… Well, here in the 21st century, movies certainly do follow science, but sometimes they go one better and imagine bold new discoveries first…

A great many inventions have migrated from our imaginations and into our reality over the centuries. To some degree, imagination is the first step for every invention. In recent years, however, there seems to have been more ‘science fiction’ technology coming into reality than ever before…

But before we get to that, here are a few classic examples: Muslim polymath (and personal hero) Al-Jazari first imagined (and built) robots as far back as the 12th century AD. Italian master inventor and artist Leonardo da Vinci first conceptualised the helicopter, solar power and the calculator back in the 15th and 16th centuries, and in 1901, ‘Wizard of Oz’ author L. Frank Baum dreamed up a ‘character marker’ that took the form of a pair of glasses and worked in much the same way that AR (augmented reality) technology does today.

But that’s not all, not by a long shot.

The spacecraft first imagined by writer H.G. Wells in ‘The First Men in the Moon’ became a reality in 1969 and, if British physics professor and sometime pop star Brian Cox is right, Wells’ time machine may not be too far away either (although, as always with time, it’s all relative).

Another favourite of mine was the elaborate setup of tape recorders employed by The Avengers’ Mrs Peel, which would automatically record a message in case Steed called her and she happened to be out. That was then. Now? Leave a message at the tone, Mrs. Peel…

So, what imaginary technology has recently made the jump from science fiction to science fact?

Firstly, there’s the interactive newspaper from the 2002 Steven Spielberg movie ‘Minority Report’. This newspaper constantly updated itself as Tom Cruise’s character read through it. It was a fun piece of fiction until, in 2010, German newspaper Süddeutsche Zeitung (SZ) made it fact. Now, if you use a special smartphone app, you can bring some of their supplements to life in much the same way that the imaginary newspaper did back in 2002. It’s a trend that has caught on around the world.

Oh yeah, remember that bit in ‘2001: A Space Odyssey’ where the bloke eats some food while watching a video on a flat, slab-like screen? Well, my generation will be the last one to find that 1968 scene surprising. Our children will simply assume he’s using some sort of iPad (and a crappy looking one at that).

‘Star Trek’s dermal regenerator took its first steps towards the world of the real when scientist Jörg C. Gerlach invented what he calls a ‘skin cell gun’. It’s not yet approved by the FDA, but it has proven to be an effective way of re-growing skin following a bad burn (although it is unable to treat third-degree burns, sadly).

Also, it’s worth pointing out that earlier this year PayPal co-founder Elon Musk announced that he was working towards developing a viable ‘warp drive’ technology.

Put simply, everything begins life as an idea. To quote comic book author Grant Morrison’s 2011 book ‘Supergods’ (which also points to the Jack Kirby concept of ‘Mother Boxes’ and neatly relates them to modern smartphones and tablets): “the bomb, too, was only an idea that someone hammered into being”…

So what’s next? Well, close your eyes and imagine.


Monday, 3 March 2014

Some Funnies from the Internet 02

So, having found some more pictures that amused me, I thought I would post a few.


Review of the iPad Air

In many ways, 2014 is going to be a lot like 2013, tablet-wise. Android tablets will continue to sell well, overly sensitive techie types will still cling to the idea that Windows 8 is simply ‘misunderstood’ (wilfully ignorant of the general consensus that it is complete and utter arse) and tablets will spring from the most unlikely of places (keep an eye out for the Co-Op ‘Tumblewumblebum’ and the Play.com ‘Zworfnik’). However, the trend least likely to change is that of an iPad leading the pack. We know this because Apple have just released the iPad Air…And it kicks major ass.

THE SPECS

Design-wise, the iPad Air is all beautiful futurism and intelligent layout. The emphasis here is to make everything smaller, lighter and more travel-friendly, without losing the tablet’s most popular aspects. For starters, the bezel is 43% thinner than ever before, and the overall device is now 28% lighter.
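
That 28% figure checks out against the published weights of the Wi-Fi models, quoted here from memory (652g for the fourth-generation iPad, 469g for the Air):

    ipad_4_g = 652     # 4th-gen iPad, Wi-Fi model
    ipad_air_g = 469   # iPad Air, Wi-Fi model

    saving = 1 - ipad_air_g / ipad_4_g
    print(f"{saving:.0%} lighter")   # 28% lighter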

An Apple A7 chip bestows 64-bit power on the Air, thus ensuring that absolutely everything runs smoothly for the user. In short, this thing is all but perfect.

THE PRICE

Ask someone who isn’t a bitter Microsoft crony what the worst thing about Apple products is and they’ll inevitably say the same thing as everyone else: “they’re too effing expensive!”

This is absolutely true.

However, Apple are taking steps to remedy this problem. Don’t believe me? The iPad Air, very probably the best tablet computer in the world right now, is available at about £399 to £740, depending on the model. For Apple, this is pretty impressive.

THE PERFORMANCE

The top-end iPad Air is able to store umpteen million songs, as well as more apps than you can fling an Angry Bird at. In fact, you can use it for anything: books, movies, the lot.

But how does it handle? Well, to put it into technical terms, the iPad Air is smoother than an industrial sander on the pull.

However, there are still problems (I did say “all but perfect”, after all). The 16GB version is completely inadequate for downloading apps, or for general use, in fact. The 16GB iPad Air is a lot like the singers you see on those crappy TV talent shows, you know, the ones who attempt to croon along to all-time classic hits by warbling and trembling throughout the audition. Great material, oh yes indeedy, but frustratingly short-sighted execution. Still, as flaws go, it isn’t a biggie; it just means that I have to recommend a slightly more expensive version (oh, Apple!)



THE VERDICT

Were we reviewing the 16GB version, the following verdict might be slightly different, but we aren’t; we’re looking at this new iPad as a whole, with ALL versions under the spotlight. With that in mind, we can confidently say that the iPad Air is, at the time of writing, the best tablet in the world. ‘Nuff said!