PROJET AUTOBLOG


The Hacker News

Original site: The Hacker News


Here's Why Microsoft Drops a Cloud Data Center Under the Ocean

Wednesday, February 3, 2016 at 11:30
While tech companies like Facebook and Google prefer to move their data centers to colder countries to reduce their air-conditioning bills, Microsoft has come up with an even better home for data centers that cuts the high energy cost of cooling them: under the sea.

Here's what Microsoft says:

"50% of us live near the coast. Why doesn't our data?"

Building massive data centers underwater might sound crazy, but that is exactly what Microsoft is testing with its first submarine data center, dubbed Leona Philpot.

World's First Underwater Data Center


The testing is part of Microsoft's Project Natick — an ongoing research project to build and run a data center submerged in the ocean, which the company believes could make data centers faster, more cost-effective, environmentally friendly, and easier to set up.

Leona Philpot (named after the Halo character from Microsoft's Xbox) was tested last August, when engineers placed an enormous steel capsule a kilometer off the California coast, 30 feet underwater in the Pacific Ocean.

A single datacenter computing rack was placed in an eight-foot-wide steel capsule, which was covered in around 100 sensors to monitor every aspect of the underwater conditions: pressure, humidity, and, most importantly, motion.
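The monitoring described above can be pictured as a simple threshold check over telemetry readings. The sketch below is purely illustrative; the sensor names and alert limits are invented for the example, not Project Natick's actual values:

```python
# Illustrative capsule-telemetry check; sensor names and thresholds are
# assumptions for this example, not Microsoft's actual monitoring system.

ALERT_LIMITS = {"pressure_kpa": 250.0, "humidity_pct": 60.0, "motion_g": 0.5}

def check_telemetry(readings: dict) -> list:
    """Return the names of sensors whose readings exceed their alert limit."""
    return [name for name, value in readings.items()
            if value > ALERT_LIMITS.get(name, float("inf"))]

print(check_telemetry({"pressure_kpa": 240.0, "humidity_pct": 72.5, "motion_g": 0.1}))
# ['humidity_pct']
```

With ~100 sensors reporting continuously, even a scheme this simple lets engineers on shore spot a developing problem long before it threatens the hardware.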

The test ran from August to November last year (exactly 105 days) and the engineers said it was more successful than expected.

Why Underwater Data Center?


According to Microsoft, these are the main reasons for experimenting with underwater data centers:

1. Air conditioning is one of the biggest costs of running data centers. Traditional data centers are believed to consume up to 3 percent of the world's electricity.

So, placing data centers in the ocean eliminates the need for air conditioning and sharply cuts the energy required to dissipate the heat generated by the racks upon racks of servers that process and store the world's digital lives.

2. Half of the world's population lives within 200 kilometers of a coast, so placing data centers in the sea would reduce latency – the time data takes to travel from its source to customers – which simply means faster delivery of data.

3. Reduce the time to build a data center from 2 Years to 90 Days. Microsoft believes that if it can mass produce the steel capsules, the company could build data centers in just 90 days.

This would make its operations cheaper and much quicker than the time needed to set a data center up on land.

Moreover, the capsules designed by the company could adopt new, innovative rack designs that need not accommodate human access at all.

4. Use of renewable energy. The project's engineers even believe that in the future, underwater data centers might power themselves with renewable energy – in this case, perhaps underwater turbines or tidal power generating the electricity.

5. Environment-friendly. Microsoft is also tackling the environmental concerns raised by underwater data centers. The company says its current prototype emits an "extremely" small amount of heat energy into the surrounding waters.
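The latency argument in point 2 can be made concrete with a back-of-envelope calculation. The fiber speed and distances below are illustrative assumptions, not Microsoft's figures:

```python
# One-way latency over optical fiber is bounded below by distance divided by
# signal speed. Light in fiber travels at roughly 2/3 of c, about 200,000 km/s
# (an approximation used here for illustration).

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed in km per millisecond

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time over fiber, ignoring routing and queuing."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# A user 200 km from an offshore data center vs. 2,000 km from an inland one:
print(min_round_trip_ms(200))    # 2.0 ms
print(min_round_trip_ms(2000))   # 20.0 ms
```

Real round-trip times are several times these physical lower bounds once routing and queuing are included, but the proportionality holds: cutting the distance tenfold cuts the floor on latency tenfold.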

A Few Limitations:


Data centers on land are open for IT engineers to fix issues and replace servers whenever required, but the company wants its undersea data centers to go without maintenance for years at a time.

Since Microsoft doesn't have a team of scuba engineers, each Natick data center unit would operate for five years without maintenance and then be dragged up to the surface to have its internal parts replaced.

Other obvious risks for submarine data centers include corrosive saltwater and harsh weather, to name just two potential hurdles.

Future Of The Data Center


The company started working on the idea in 2013; development of a physical prototype began in 2014 and culminated last August with its first-ever submarine data center, Leona Philpot.

Since Microsoft's Project Natick is still in its "early days," it is hard to say when underwater data centers will actually be adopted. However, Microsoft plans to design a new version of its underwater data center that is three times larger than Leona Philpot.

It is not just Microsoft; many tech companies are considering new ways of housing data. In 2013, Facebook located one of its latest state-of-the-art data centers in Luleå, in the far north of Sweden, to make use of cheap, renewable energy generated by hydroelectric schemes and of outside air for cooling.

WikiLeaks' Julian Assange Could Be Set Free on Friday by the United Nations

Tuesday, February 2, 2016 at 18:15
The findings of the United Nations investigation into the Julian Assange case are set to be revealed on February 5 and could order the release of the WikiLeaks founder.

"BREAKING: UN set to announce decision on #Assange's release on Friday," Wikileaks has tweeted.

Assange has been living in the Ecuadorian embassy in London for over three years, having been granted political asylum by the government of the South American country.
UN Working Group on Arbitrary Detention (WGAD)
Assange has been residing in the embassy since 2012 to avoid extradition:
  • First to Sweden where he is facing sexual assault allegations, which he has always denied.
  • Ultimately to the United States where he could face cyber espionage charges for publishing classified US military and diplomat documents via his website Wikileaks.

The publication of those secret documents amounted to the largest information leak in United States history. The US also launched a criminal case against Assange following the leak.

However, Assange filed a complaint against Sweden and the United Kingdom in September 2014 that has been considered by the UN Working Group on Arbitrary Detention.

The decision on the case will be published on Friday, and if the group concludes that Assange is being illegally detained, the UN is expected to call on the UK and Sweden to release him.

NASA HACKED! AnonSec tried to Crash $222 Million Drone into Pacific Ocean

Tuesday, February 2, 2016 at 14:08
Once again, alarm bells have sounded at the security desk of the National Aeronautics and Space Administration (NASA).

Yes! This time, the hacktivist group "AnonSec," which made its presence known in the cyber universe through previous NASA hacks, has claimed responsibility for a serious breach.

AnonSec members allegedly released 276 GB of sensitive data, including 631 video feeds from aircraft and weather radars, 2,143 flight logs, and the credentials of 2,414 NASA employees, including e-mail addresses and contact numbers.

The hacking group has released a self-published paper, named "Zine," that explains the magnitude of the network breach that compromised NASA's systems and the group's motives behind the leak.

Here’s How AnonSec Hacked into NASA


The cyber attack against NASA was not originally planned by AnonSec members; it grew out of the spread of the Gozi virus, which affected millions of systems a year earlier.

After purchasing an "initial foothold" in 2013 from a hacker with knowledge of NASA servers, the AnonSec hackers claimed to have pentested the NASA network to figure out how many systems were penetrable, the group told InfoWar.

Brute-forcing an administrator's SSH password reportedly took only 0.32 seconds because of the weak password policy, and the group gained further internal access that allowed it to grab more login information with a hidden packet-sniffing tool.
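To see why a weak password policy collapses that quickly, a back-of-envelope calculation helps. The alphabet sizes and guess rate below are assumptions for illustration, not details from the AnonSec zine:

```python
# Back-of-envelope sketch (assumed numbers, not AnonSec's actual tooling):
# the worst-case time to exhaust a password space is keyspace / guess rate.

def worst_case_seconds(alphabet_size: int, length: int,
                       guesses_per_second: float) -> float:
    """Seconds to try every password of the given length and alphabet."""
    return alphabet_size ** length / guesses_per_second

# A 4-character, lowercase-only password at a million guesses per second:
weak = worst_case_seconds(26, 4, 1_000_000)      # ~0.46 s
# A 12-character password over ~95 printable ASCII characters:
strong = worst_case_seconds(95, 12, 1_000_000)
print(f"{weak:.2f} s vs {strong / (3600 * 24 * 365):.0f} years")
```

The gap is the whole story: password strength grows exponentially with length and alphabet size, so a short, simple password falls in well under a second while a long one outlasts the hardware guessing it.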

They also claimed to infiltrate successfully into the Goddard Space Flight Center, the Glenn Research Center, and the Dryden Research Center.

Hacker Attempted to Crash $222 Million Drone into the Pacific Ocean


Three network-attached storage (NAS) devices that gathered aircraft flight-log backups were also compromised, quickly opening the door to an extended hack:

Hacking Global Hawk drones, which are specialized in surveillance operations.

The hackers tried to gain control over a drone by re-routing its flight path (using a man-in-the-middle, or MitM, strategy) to crash it into the Pacific Ocean, but…

…a sudden security alert about the unusual flight plan led NASA engineers to take manual control, saving the $222.7 million drone from drowning in the ocean.

The hacking attempt was possible because of the drone operators' routine of uploading flight paths for the next flight soon after a drone session ends.

After this final episode, AnonSec lost control of the compromised NASA servers, and NASA engineers restored everything to normal.

The drone incident marked the peak of the attack, which by then had spread into other parts of NASA's network.

However, in a statement emailed to Forbes, NASA denied the alleged hacking incident, saying the leaked information could be part of freely available datasets and that there is no proof a drone was hijacked:
“Control of our Global Hawk aircraft was not compromised. NASA has no evidence to indicate the alleged hacked data are anything other than already publicly available data. NASA takes cybersecurity very seriously and will continue to fully investigate all of these allegations.”

Why Did AnonSec Hack into NASA?


Before you point fingers at the AnonSec hackers, wait! Here's what the group wants to highlight:
"One of the main purposes of the Operation was to bring awareness to the reality of Chemtrails/CloudSeeding/Geoengineering/Weather Modification, whatever you want to call it, they all represent the same thing." 
"NASA even has several missions dedicated to studying Aerosols and their affects (sic) on the environment and weather, so we targeted their systems."

And Here's What NASA was actually doing:

  • Cloud seeding: a weather-modification method that uses silver iodide to trigger precipitation in clouds, causing more rainfall – an intervention that ultimately manipulates nature.
  • Geoengineering: Geoengineering aims to tackle climate change by removing CO2 from the air or limiting the sunlight reaching the planet.
Similar projects run on behalf of the US government, such as Operation IceBridge (OIB) and the Aerosol-Cloud-Ecosystem (ACE) mission, are dedicated to climate modeling.

This security breach is a black mark for NASA's security advisory team and a warning bell to beef up its defenses.

They Named it — Einstein, But $6 Billion Firewall Fails to Detect 94% of Latest Threats

Tuesday, February 2, 2016 at 13:17
The US government's $6 Billion firewall is nothing but a big blunder.

Dubbed EINSTEIN, the nationwide firewall run by the US Department of Homeland Security (DHS) is not as smart as its name suggests.

An audit conducted by the United States Government Accountability Office (GAO) has claimed that the firewall used by US government agencies is failing to fully meet its objectives and leaving the agencies open to zero-day attacks.


EINSTEIN, officially known as the US National Cybersecurity Protection System (NCPS), has cost $5.7 Billion to develop, yet it detects only 6 percent of today's most common security vulnerabilities and misses the remaining 94 percent.

How bad is EINSTEIN Firewall in reality?


In a series of tests conducted last year, Einstein only detected 29 out of 489 vulnerabilities across Flash, Office, Java, IE and Acrobat disclosed via CVE reports published in 2014, according to a report [PDF] released by the GAO late last year.
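Those GAO figures reduce to simple arithmetic, which is where the headline 6%/94% split comes from:

```python
# The GAO test results: 29 detections out of 489 CVE-listed vulnerabilities
# tested across Flash, Office, Java, IE, and Acrobat.

detected, tested = 29, 489
rate = detected / tested
print(f"detected {rate:.1%}, missed {1 - rate:.1%}")  # detected 5.9%, missed 94.1%
```

Rounded to the nearest whole percent, that is the "6 percent detected, 94 percent missed" figure quoted throughout the coverage.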

Among the extraordinary findings is the fact that the system is:
  • Unable to monitor web traffic for malicious content.
  • Unable to uncover malware in a system.
  • Unable to monitor cloud services either.
  • Only offers signature-based threat and intrusion detection, rather than monitoring for unusual activity.
Yes, Einstein only carries out signature-based threat and intrusion detection, which means the system acts like a dumb terminal that must be told what to look for, rather than searching for unusual activity on its own.
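The limitation described above can be sketched in a few lines. This is a minimal illustration of signature-based matching in general, not EINSTEIN's actual design, and the signature strings are invented:

```python
# Minimal sketch of signature-based detection: traffic is flagged only if it
# contains a pattern already in the signature database, so any threat absent
# from the database (e.g. a zero-day) passes through unflagged.

KNOWN_SIGNATURES = {b"EVILPAYLOAD", b"EXPLOIT-2014"}  # hypothetical patterns

def signature_detect(payload: bytes) -> bool:
    """Flag a payload only if it contains a known signature."""
    return any(sig in payload for sig in KNOWN_SIGNATURES)

print(signature_detect(b"GET /EVILPAYLOAD HTTP/1.1"))   # True: known pattern
print(signature_detect(b"GET /ZERODAY-2015 HTTP/1.1"))  # False: unseen threat
```

Anomaly-based systems take the opposite approach, modeling normal behavior and flagging deviations, which is exactly the capability the GAO found missing.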

Einstein Uses Outdated Signatures Database


In fact, more than 65 percent of intrusion detection signatures (digital fingerprints of known viruses and exploit code) are outdated, making Einstein wide open to recently discovered zero-day vulnerabilities.

However, in response, DHS told the office that Einstein was always meant to be a signature-based detection system only. Here's what the department told the auditors:
"It is the responsibility of each agency to ensure their networks and information systems are secure while it is the responsibility of DHS to provide a baseline set of protections and government-wide situational awareness, as part of a defense-in-depth information security strategy."

Einstein is Effectively Blind


If that weren't enough to gauge the worth of the $6 Billion firewall, Einstein is effectively blind.

The Department of Homeland Security (DHS), which is behind the development of Einstein, has not included any feature to measure the system's own performance, so the system doesn't even know if it is doing a good job or not.
So, "until its intended capabilities are more fully developed, DHS will be hampered in its abilities to provide effective cybersecurity-related support to federal agencies," reads the report.
Einstein was originally developed in 2003 to automatically monitor agency network traffic, and was expanded in 2009 to offer signature-based detection as well as malware-blocking abilities.

Most of the 23 agencies covered are required to implement the firewall, but the GAO found that only five of them were using the system to deal with possible intrusions.

Despite $1.2 Billion spent in 2014 and $5.7 Billion on the project in total, Einstein still monitors only certain types of network flaws, with no support for monitoring web traffic or cloud services.

Microsoft Starts automatically Pushing Windows 10 to all Windows 7 and 8.1 Users

Tuesday, February 2, 2016 at 10:22


As warned last year, Microsoft is pushing Windows 10 upgrades onto its users' PCs much harder by re-categorizing Windows 10 as a "Recommended Update" in Windows Update, instead of an "optional update."

Microsoft launched Windows 10 earlier last year and offered the free upgrade for Windows 7 and Windows 8 and 8.1 users. While the company has been successful in getting Windows 10 onto more than 200 Million devices, Microsoft wants to go a lot more aggressive this year.

So, if you have enabled Automatic Windows Update on your Windows 7, 8, or 8.1 machine to install critical updates like security patches, you should watch your step, because…

...From Monday, Windows Update will start upgrading your PC to the newest Windows 10 as a recommended update, Microsoft confirmed.

Must Read: How to Stop Windows 7 or 8 from Downloading Windows 10 Automatically.

This means the Windows 10 upgrade process will download and start on hundreds of millions of devices automatically.

The move is, of course, part of Microsoft's goal to get Windows 10 running on 1 Billion devices within 2-3 years of its release.
The market share of Windows 10 is on the rise: it had already reached 11.85% as of January 2016, up from 9.96% in December. But Windows 7 still runs on over 50% of all PCs in the world, so converting even half of that user base would bring Microsoft very near its goal.
"As we shared in late October on the Windows Blog, we are committed to making it easy for our Windows 7 and Windows 8.1 customers to upgrade to Windows 10," a Microsoft spokesperson said. "We updated the update experience today to help our clients, who previously reserved their upgrade, schedule a time for their upgrade to take place."

Also Read: If You Haven't yet, Turn Off Windows 10 Keylogger Now.

This means that if the 'Give me recommended updates the same way I receive important updates' option is enabled in the Windows Update section of your PC, the Windows 10 update will not only be downloaded, but its installation will also start automatically.
You should also stay alert, because even if you have opted for manual updates, you may still end up downloading Windows 10 anyway: Windows Update automatically pre-selects the option for you, without your having to tick the box.

Must Read: Just Like Windows 10, Windows 7 and 8 Also Spy on You – Here's How to Stop Them.

However, the company says that you won't be forced onto the creepy OS, as there will still be a prompt window that requires you to click through and confirm the Windows 10 upgrade after the files have silently been downloaded and unpacked in the background.

Even if the Windows 10 upgrade is accidentally completed, there is still a way to opt out: Microsoft offers a 31-day grace period in which you can revert to your old installation after trying Windows 10 and deciding you do not like the operating system.

Though we suspect even that revert process will come with another aggressive push from Microsoft.