Friday, August 1, 2014

Transitioning from Windows to Linux administration: A guide for newcomers

Windows and Linux are operating systems with many similarities and differences. Learn some tips to help cross the bridge between the two in the first of a multi-part series.


The "Mac vs. PC" television ads from several years back were entertaining but not necessarily accurate depictions of the mindset behind Apple and Windows users. In truth, neither users nor administrators of any particular operating system can be easily categorized or pigeonholed. Apple computers aren't wielded exclusively by glamorous "go against the grain" hipsters, nor are Windows systems relied upon only by stodgy corporate heads.
However, it is a fact that all operating systems have an array of similarities and differences between them. While the similarities can provide universal standards to help orient new users and administrators, ...

Microsoft Sharks Cove Review

Microsoft has finally joined the mini PC craze, introducing a Windows-compatible development board dubbed Sharks Cove.


First teased during its April Build conference, the Raspberry Pi-ish device is the result of the combined efforts of Microsoft, Intel and product manufacturer CircuitCo.

It's designed to facilitate development of software and drivers for mobile devices that run Windows, such as phones, tablets and similar SoC platforms, although it can also be used for Android development. 
The pint-sized PC features a 1.33GHz Intel Atom processor with integrated HD graphics, 16GB of eMMC storage, a MIPI connector for display and camera, HDMI, one USB 2.0 port and a micro-USB power port. Ethernet and Wi-Fi are available only through USB, meaning users will have to connect to the internet or other networks with a USB adapter.
At $299, the board is priced significantly higher than its Raspberry Pi or Arduino board counterparts. Microsoft said the price covers the cost of the hardware, a Windows 8.1 image, and the slightly vague "utilities" required to apply it to the Sharks Cove.
Still, Microsoft hopes the board will find a home with Independent Hardware Vendors and hardware enthusiasts willing to shell out eight times the cost of the $35 Raspberry Pi Model B. Even Intel's own Galileo board, which also targets developers, is significantly cheaper.
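As a back-of-the-envelope check on the price gap described above, here is a small Python sketch. The Sharks Cove and Raspberry Pi figures come from the article; the Galileo price is an assumed ballpark, since the article only says it is "significantly cheaper":

```python
# Dev-board prices cited in the article (the Galileo figure is an
# assumption; the article does not give its exact price).
boards = {
    "Sharks Cove": 299,
    "Raspberry Pi Model B": 35,
    "Intel Galileo": 70,  # assumed ballpark figure
}

baseline = boards["Raspberry Pi Model B"]
for name, price in sorted(boards.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${price} ({price / baseline:.1f}x a Raspberry Pi)")
```

Running this puts the Sharks Cove at roughly 8.5 times the Pi's price, consistent with the "eight times" figure above.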
Microsoft ended its Sharks Cove blog post like this:
We're very excited and proud of the work done to make the Sharks Cove a reality. We are looking forward to seeing the amazing things that can be done with these boards!

Google Glass: the good, the bad, and the unknown

Google Glass is an innovative product in the nascent wearable-computing industry. It has the potential to create an entirely new device class, the smart-glass, and an app ecosystem to go along with it.
Here’s the good, the bad, and some unanswered questions surrounding its imminent public release:
Good:
- lightweight
- a hella lotta hardware in a very miniaturized enclosure
- pioneering wearable form factor
- near instantaneous information at your fingertips “google…”
- sub second search results direct to your retina
- voice command
- private audio via bone conduction
- private video via pico-projection
- look like a total tech badass to your geek friends
- your choice of color
- super simple development API
- transparent overlay of video atop natural vision
- screen is subtle: covers a very small %age of total field of view
- bluetooth tethering to smartphone keeps radiation away from brain.
Bad:
- battery life
- the style factor – look funny to your non-geek friends
- excessive UX reliance on side-mounted touch surface
- lack of native app dev for developers
- all info & services must go through Google servers
- somewhat limited functionality of API
- price : $1500 + tax for developer prototype
- lock-in to Google ecosystem
- possible portal for advertising direct to your retina
- screen only covers a very small %age of total field of view
- monoscopic : only on one eye
- lack of 3G / 4G radio requires external smartphone or wifi hub.
TBA:
- style options for sunglass and prescription lens retrofits
- price for consumers : could hit a nice sweet spot
- what accessories will be available?
Conclusion:
Worth a spin. For developers, definitely worth $1500 for an opportunity to be in early days of App Store, Google Play, or equivalent smartglass marketplace.
For consumers, a maybe: it will depend on individual fashion sense, available accessories, available apps, and price point.


via: dsky9

Making 3D Technology Simple

ModernTech is the leading value-added reseller of 3D CAD, 3D Printers and Engineering Technology in the Southeast including Florida, South Carolina, North Carolina, Maryland, District of Columbia, Virginia, West Virginia, Georgia, Alabama, Arkansas, Mississippi, Louisiana, Texas and Tennessee. We have solutions to help you design better products, faster than your competition: SOLIDWORKS 3D CAD, Analysis, Data Management, Stratasys 3D Printers, and more. Our combination of great people, excellent products and stellar services enables you to maximize your profitability and success.

The Next Mars Rover Will Have Better Lasers and X-Ray Vision

NASA announced today that its next Mars rover will have advanced cameras, more sophisticated lasers, and the ability to see underground as it explores the Red Planet starting in 2020.
The mission, currently being called the Mars 2020 rover (until NASA can give it a better name), is a twin of the Curiosity rover currently on Mars. This duplication allows NASA to save money because they already had a spare machine sitting around, but Mars 2020 won’t just be carrying repeats of Curiosity’s state-of-the-art gadgets. Instead, the probe will be building on the scientific discoveries from Curiosity, preparing to return samples of Mars to Earth, and even paving the way for future human exploration. Here’s a breakdown of all the rover’s new gear.

Super Zoom

First up is the new rover’s souped-up camera system called Mastcam-Z, a multispectral binocular imager. We’re all used to the incredible photos that Curiosity sends back but this new camera will be able to shoot pictures in multiple wavelengths, allowing scientists to see things that would otherwise be invisible to our eyes. It will also be able to zoom, an ability that Curiosity’s 17 cameras sadly lack, which will make it possible for the rover to rapidly map out its surroundings, build terrain models, and plot out its path on Mars.

Rainbow Laser



Just like Curiosity, Mars 2020 will be carrying a laser that it will use to shoot unsuspecting rocks. But NASA is promising that the new rover’s laser will be even better than Curiosity’s. Called SuperCam, the instrument will incinerate small bits of rocks on the ground and then analyze the resulting vapor to determine their composition. Curiosity’s ChemCam instrument does a similar job, but Supercam will have the ability to examine the smoke in multiple wavelengths, including visible and infrared (thus the rainbow nickname), that will give it a better understanding of the types of minerals around it. This will help the science team decide whether or not certain rocks are better to investigate further, and which ones to take samples from.
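The composition analysis SuperCam performs boils down to matching emission-line wavelengths in the vaporized rock's spectrum against reference tables. The sketch below illustrates the idea only; the reference values are a tiny hand-picked subset of approximate textbook line wavelengths, not SuperCam's actual line database:

```python
# Illustrative sketch of spectral line matching, the core idea behind
# laser-induced breakdown spectroscopy. The wavelengths below are
# approximate textbook values for a few well-known emission lines;
# a real instrument consults a database with thousands of lines.
REFERENCE_LINES_NM = {
    "Fe": [438.35],           # an iron line
    "Ca": [393.37, 422.67],   # strong calcium lines
    "Na": [589.00, 589.59],   # the sodium D doublet
}

def identify(measured_nm, tolerance_nm=0.5):
    """Return the elements whose reference lines match the measured peaks."""
    found = set()
    for peak in measured_nm:
        for element, lines in REFERENCE_LINES_NM.items():
            if any(abs(peak - line) <= tolerance_nm for line in lines):
                found.add(element)
    return found

print(sorted(identify([393.4, 589.1])))  # ['Ca', 'Na']
```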

Oxygen Maker

As part of NASA’s efforts to bring humans to Mars, the new rover has Mars Oxygen ISRU Experiment (MOXIE), a tool that can extract carbon dioxide from the atmosphere and break it apart to produce pure oxygen. This will be the first test of what’s known as in situ resource utilization—essentially using the stuff around you to live off the land—and could help astronauts produce breathable gases or rocket fuel on a future mission.
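A quick stoichiometry sketch shows what that conversion could yield. This assumes the reaction 2 CO2 → 2 CO + O2 and full conversion; the article gives no details of the MOXIE process itself:

```python
# Back-of-the-envelope oxygen yield from Martian CO2, assuming the
# reaction 2 CO2 -> 2 CO + O2 with complete conversion (an idealized
# assumption; real efficiency would be lower).
M_CO2 = 44.01  # molar mass of CO2, g/mol
M_O2 = 32.00   # molar mass of O2, g/mol

def o2_yield_kg(co2_kg: float) -> float:
    """Mass of O2 obtainable from a given mass of CO2."""
    mol_co2 = co2_kg * 1000 / M_CO2
    mol_o2 = mol_co2 / 2           # stoichiometry: 2 mol CO2 per mol O2
    return mol_o2 * M_O2 / 1000

print(f"{o2_yield_kg(1.0):.2f} kg O2 per kg CO2")  # ~0.36
```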

Weather Station

Mars 2020 will have a weather station called Mars Environmental Dynamics Analyzer (MEDA) that will record the local temperature, humidity, pressure, and wind speed. This instrument will also study dust in the atmosphere, analyzing its size and shape, another important part of one day sending humans, who will have to find a way to avoid getting their gear contaminated with this potentially lethal stuff.

Ground-Penetrating Radar


The new rover will have the awesome ability to see underground on Mars, using its ground-penetrating radar. Known as Radar Imager for Mars’ Subsurface Exploration (RIMFAX), this instrument will scan up to a third of a mile beneath the surface as the rover travels around. Mars 2020 will be able to resolve objects as small as an inch or two in size, giving scientists a glimpse of what goes on deep below their rover’s wheels.
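The textbook range-resolution formula for a radar gives a feel for what inch-scale resolution implies. The permittivity value below is an assumption for dry rock; the article gives no RIMFAX design parameters:

```python
# Textbook range-resolution estimate for a ground-penetrating radar:
# resolution dr = c / (2 * B * sqrt(eps)), so the bandwidth needed is
# B = c / (2 * dr * sqrt(eps)). The relative permittivity is assumed.
C = 3.0e8  # speed of light in vacuum, m/s

def required_bandwidth_hz(resolution_m: float, rel_permittivity: float) -> float:
    """Bandwidth needed to achieve a given range resolution in a medium."""
    return C / (2 * resolution_m * rel_permittivity ** 0.5)

# ~2 inches (5 cm) in dry rock with an assumed relative permittivity of 9:
b = required_bandwidth_hz(0.05, 9.0)
print(f"{b / 1e9:.1f} GHz")  # 1.0 GHz
```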


X-Ray Fluorescence

NASA will be sending its new rover with an X-ray fluorescence spectrometer called the Planetary Instrument for X-ray Lithochemistry (PIXL) that can map out each element in a particular rock. This will allow geologists to figure out what kind of minerals are in a rock sample. Because microbes need certain materials to thrive, these minerals could indicate places where we are more likely to find evidence of past life.

Organic Molecule Hunter

Finally, Mars 2020 will be carrying an instrument called Scanning Habitable Environments with Raman & Luminescence for Organics and Chemicals (SHERLOC), which will use an ultraviolet laser to scan for organic molecules. Also able to give a very detailed look at the mineralogy of rocks, SHERLOC will complement the X-ray abilities of PIXL. The science team is particularly interested in making sure that its instrument’s capabilities have some overlap, so that they can check and double-check their work.

Sample Return Prep

Though not a particular instrument, the new rover will be using all its sophisticated apparatuses to figure out what are the best rock samples for scientists back on Earth to study in more detail. It will use a drill to place around 30 pencil-sized rock cylinders inside of sealed canisters. One day, a future mission could pick up these jars and return them to our planet, fulfilling a dream that the planetary science community has long hoped for and giving them the chance to analyze recent pieces of Mars up close.

Dinosaurs were killed by 'colossal bad luck': Prehistoric creatures may have survived if the Earth was struck by an asteroid earlier

  • A study led by the University of Edinburgh says the dinosaurs may have survived if the asteroid struck several million years earlier or later
  • Many experts agree that an asteroid strike in what is now Mexico caused the demise of many dinosaur species 66 million years ago
  • Creatures were already suffering environmental upheaval - with widespread volcanic activity and changing temperatures - which left them vulnerable
  • If the asteroid impact had come a few million years earlier, when the range of species was bigger and food chains more robust, they may have survived
  • If it had come later in history when new species had been given the chance to evolve, they might have escaped extinction, the experts said


Read more: http://www.dailymail.co.uk/sciencetech/article-2708171/Dinosaurs-killed-colossal-bad-luck-Prehistoric-creatures-survived-Earth-struck-asteroid-earlier.html#ixzz399Sd9Zt7

Modern technology is changing the way our brains work

Human identity, the idea that defines each and every one of us, could be facing an unprecedented crisis.
It is a crisis that would threaten long-held notions of who we are, what we do and how we behave. 
It goes right to the heart - or the head - of us all. This crisis could reshape how we interact with each other, alter what makes us happy, and modify our capacity for reaching our full potential as individuals. 
And it's caused by one simple fact: the human brain, that most sensitive of organs, is under threat from the modern world. 

Unless we wake up to the damage that the gadget-filled, pharmaceutically-enhanced 21st century is doing to our brains, we could be sleepwalking towards a future in which neuro-chip technology blurs the line between living and non-living machines, and between our bodies and the outside world.
It would be a world where such devices could enhance our muscle power, or our senses, beyond the norm, and where we all take a daily cocktail of drugs to control our moods and performance.
Already, an electronic chip is being developed that could allow a paralysed patient to move a robotic limb just by thinking about it. As for drug-manipulated moods, they're already with us - although so far only to a medically prescribed extent.
Increasing numbers of people already take Prozac for depression, Paxil as an antidote for shyness, and give Ritalin to children to improve their concentration. But what if there were still more pills to enhance or "correct" a range of other specific mental functions?
What would such aspirations to be "perfect" or "better" do to our notions of identity, and what would it do to those who could not get their hands on the pills? Would some finally have become more equal than others, as George Orwell always feared?
Of course, there are benefits from technical progress - but there are great dangers as well, and I believe that we are seeing some of those today.
I'm a neuroscientist and my day-to-day research at Oxford University strives for an ever greater understanding - and therefore maybe, one day, a cure - for Alzheimer's disease.
But one vital fact I have learnt is that the brain is not the unchanging organ that we might imagine. It not only goes on developing, changing and, in some tragic cases, eventually deteriorating with age, it is also substantially shaped by what we do to it and by the experience of daily life. When I say "shaped", I'm not talking figuratively or metaphorically; I'm talking literally. At a microcellular level, the infinitely complex network of nerve cells that make up the constituent parts of the brain actually change in response to certain experiences and stimuli.
The brain, in other words, is malleable - not just in early childhood but right up to early adulthood, and, in certain instances, beyond. The surrounding environment has a huge impact both on the way our brains develop and how that brain is transformed into a unique human mind.
Of course, there's nothing new about that: human brains have been changing, adapting and developing in response to outside stimuli for centuries.
What prompted me to write my book is that the pace of change in the outside environment and in the development of new technologies has increased dramatically. This will affect our brains over the next 100 years in ways we might never have imagined.
Our brains are under the influence of an ever-expanding world of new technology: multichannel television, video games, MP3 players, the internet, wireless networks, Bluetooth links - the list goes on and on.


But our modern brains are also having to adapt to other 21st century intrusions, some of which, such as prescribed drugs like Ritalin and Prozac, are supposed to be of benefit, and some of which, such as widely available illegal drugs like cannabis and heroin, are not.
Electronic devices and pharmaceutical drugs all have an impact on the micro-cellular structure and complex biochemistry of our brains. And that, in turn, affects our personality, our behaviour and our characteristics. In short, the modern world could well be altering our human identity.
Three hundred years ago, our notions of human identity were vastly simpler: we were defined by the family we were born into and our position within that family. Social advancement was nigh on impossible and the concept of "individuality" took a back seat.
That only arrived with the Industrial Revolution, which for the first time offered rewards for initiative, ingenuity and ambition. Suddenly, people had their own life stories - ones which could be shaped by their own thoughts and actions. For the first time, individuals had a real sense of self.
But with our brains now under such widespread attack from the modern world, there's a danger that that cherished sense of self could be diminished or even lost.
Anyone who doubts the malleability of the adult brain should consider a startling piece of research conducted at Harvard Medical School. There, a group of adult volunteers, none of whom could previously play the piano, were split into three groups.
The first group were taken into a room with a piano and given intensive piano practice for five days. The second group were taken into an identical room with an identical piano - but had nothing to do with the instrument at all.
And the third group were taken into an identical room with an identical piano and were then told that for the next five days they had to just imagine they were practising piano exercises.
The resultant brain scans were extraordinary. Not surprisingly, the brains of those who simply sat in the same room as the piano hadn't changed at all.
Equally unsurprising was the fact that those who had performed the piano exercises saw marked structural changes in the area of the brain associated with finger movement.
But what was truly astonishing was that the group who had merely imagined doing the piano exercises saw changes in brain structure that were almost as pronounced as those that had actually had lessons. "The power of imagination" is not a metaphor, it seems; it's real, and has a physical basis in your brain.
Alas, no neuroscientist can explain how the sort of changes that the Harvard experimenters reported at the micro-cellular level translate into changes in character, personality or behaviour. But we don't need to know that to realise that changes in brain structure and our higher thoughts and feelings are incontrovertibly linked.
What worries me is that if something as innocuous as imagining a piano lesson can bring about a visible physical change in brain structure, and therefore some presumably minor change in the way the aspiring player performs, what changes might long stints playing violent computer games bring about? That eternal teenage protest of 'it's only a game, Mum' certainly begins to ring alarmingly hollow.
Already, it's pretty clear that the screen-based, two-dimensional world that so many teenagers - and a growing number of adults - choose to inhabit is producing changes in behaviour. Attention spans are shorter, personal communication skills are reduced and there's a marked reduction in the ability to think abstractly.
This games-driven generation interpret the world through screen-shaped eyes. It's almost as if something hasn't really happened until it's been posted on Facebook, Bebo or YouTube.
Add that to the huge amount of personal information now stored on the internet - births, marriages, telephone numbers, credit ratings, holiday pictures - and it's sometimes difficult to know where the boundaries of our individuality actually lie. Only one thing is certain: those boundaries are weakening.
And they could weaken further still if, and when, neurochip technology becomes more widely available. These tiny devices will take advantage of the discovery that nerve cells and silicon chips can happily co-exist, allowing an interface between the electronic world and the human body. One of my colleagues recently suggested that someone could be fitted with a cochlear implant (devices that convert sound waves into electronic impulses and enable the deaf to hear) and a skull-mounted microchip that converts brain waves into words (a prototype is under research).
Then, if both devices were connected to a wireless network, we really would have arrived at the point which science fiction writers have been getting excited about for years. Mind reading!
He was joking, but for how long the gag remains funny is far from clear.
Today's technology is already producing a marked shift in the way we think and behave, particularly among the young.
I mustn't, however, be too censorious, because what I'm talking about is pleasure. For some, pleasure means wine, women and song; for others, more recently, sex, drugs and rock 'n' roll; and for millions today, endless hours at the computer console.
But whatever your particular variety of pleasure (and energetic sport needs to be added to the list), it's long been accepted that 'pure' pleasure - that is to say, activity during which you truly "let yourself go" - was part of the diverse portfolio of normal human life. Until now, that is.
Now, coinciding with the moment when technology and pharmaceutical companies are finding ever more ways to have a direct influence on the human brain, pleasure is becoming the sole be-all and end-all of many lives, especially among the young.
We could be raising a hedonistic generation who live only in the thrill of the computer-generated moment, and are in distinct danger of detaching themselves from what the rest of us would consider the real world.
This is a trend that worries me profoundly. For as any alcoholic or drug addict will tell you, nobody can be trapped in the moment of pleasure forever. Sooner or later, you have to come down.
I'm certainly not saying all video games are addictive (as yet, there is not enough research to back that up), and I genuinely welcome the new generation of "brain-training" computer games aimed at keeping the little grey cells active for longer.
As my Alzheimer's research has shown me, when it comes to higher brain function, it's clear that there is some truth in the adage "use it or lose it".
However, playing certain games can mimic addiction, and the heaviest users of these games might soon begin to do a pretty good impersonation of an addict.
Throw in circumstantial evidence that links a sharp rise in diagnoses of Attention Deficit Hyperactivity Disorder and the associated three-fold increase in Ritalin prescriptions over the past ten years with the boom in computer games and you have an immensely worrying scenario.
But we mustn't be too pessimistic about the future. It may sound frighteningly Orwellian, but there may be some potential advantages to be gained from our growing understanding of the human brain's tremendous plasticity. What if we could create an environment that would allow the brain to develop in a way that was seen to be of universal benefit?
I'm not convinced that scientists will ever find a way of manipulating the brain to make us all much cleverer (it would probably be cheaper and far more effective to manipulate the education system). And nor do I believe that we can somehow be made much happier - not, at least, without somehow anaesthetising ourselves against the sadness and misery that is part and parcel of the human condition.
When someone I love dies, I still want to be able to cry.
But I do, paradoxically, see potential in one particular direction. I think it possible that we might one day be able to harness outside stimuli in such a way that creativity - surely the ultimate expression of individuality - is actually boosted rather than diminished.
I am optimistic and excited by what future research will reveal into the workings of the human brain, and the extraordinary process by which it is translated into a uniquely individual mind.
But I'm also concerned that we seem to be so oblivious to the dangers that are already upon us.
Well, that debate must start now. Identity, the very essence of what it is to be human, is open to change - both good and bad. Our children, and certainly our grandchildren, will not thank us if we put off discussion much longer.
• Adapted from ID: The Quest For Identity In The 21st Century by Susan Greenfield, to be published by Sceptre on May 15 at £16.99. To order a copy for £15.30 (p&p free), call 0845 606 4206.


Read more: http://www.dailymail.co.uk/sciencetech/article-565207/Modern-technology-changing-way-brains-work-says-neuroscientist.html#ixzz399S1ege0