10 Oct 2017

Literally Every Yahoo Account Ever Hacked

by Bill

When a company suffers a major hack, you expect a few things. First, some accounts have probably been compromised, but maybe not all of them. Second, not all of the information will be available right away. Yahoo has a lot to answer for on both counts: it recently revealed that a 2013 attack on its servers, previously thought to have compromised about 1 billion accounts, in fact affected all 3 billion accounts in existence at the time. This massive revision of how many accounts were affected matters a great deal to consumers, and it is also significant because Yahoo was in discussions to be bought out by Verizon in the interim.

The information exposed may not be quite as sensitive as the financial data taken in the recent Equifax breach, but it is still serious: e-mail addresses, phone numbers, and passwords can be used illegally in conjunction with the Equifax data to compromise the identities of millions of people globally.

This rash of hacks has many wondering what the point is of shredding old bank statements and phone bills before discarding them, when it is now so easy for nefarious parties to simply gather your hacked data digitally.

In this climate of hacking, it’s important to work with companies that you truly trust. You can count on NationalNet to provide you with hosting that will keep your data as secure as modern technology allows. Much is being done to research new forms of data security, as previously reported here on the NationalNet Blog, including China’s quantum entanglement plans. As new tools become available at the commercial level, we will continue to implement them quickly… hopefully Yahoo and the credit bureaus will do the same.

29 Sep 2017

Google Opens Up Search Ad Buys To Rivals With Little Hope Of Improvements

by Bill

Until now, Google has reserved spots on particular search terms for itself, but it has now promised to open up bidding on those terms to rivals at the top of product search results in Europe. Google announced the change in response to a June ruling by the European Commission, which found in part that Google had systematically abused its dominance in search to gain competitive advantages. This is one of three cases pending against Google in Europe that may lead to sweeping antitrust regulatory action in the EU and beyond if Google fails to impress regulators with its remediation actions.

The June ruling by the commission ordered Google to pay €2.42 billion (about $2.8 billion) in fines, but analysts agree that one-time expense is nothing compared to the longer-term threat to Google's business model if regulators persist. For example, Paul Gallant, an analyst with Cowen, is quoted as saying that Google is likely concerned that the order could also be applied to other verticals, including travel and local search.

To strengthen its legal footing with EU regulators, Google also said Wednesday that Google Shopping (which is the core of the case) will start operating as a standalone unit in Europe, bidding against all companies for featured spots to help level the playing field, but experts and insiders are saying that change is far from being enough of a move to actually open up real competition.

As Maurice Stucke, a law professor and co-founder of the Konkurrenz Group, explained, “Imagine if a company became a monopoly by burning down all the warehouses and factories of its rivals. If regulators instruct the company to stop burning things down, the monopoly will say, ‘Fine, I can live with that. I promise not to burn down any more factories.’” Yet clearly, that’s not a solution to the problem that persists.

It will be interesting to see how these cases are resolved, with millions of search terms and billions, or even trillions, of dollars at stake.

22 Sep 2017

Light Has Been Stored as Sound For The First Time

by Bill

Scientists have discovered a way to store light-based information as sound waves on a computer chip. Why does that matter? The conversion is critical to shifting away from the inefficient electronic computers currently in use to a light-based computer that moves data at a much faster velocity than anything on the market right now.

Light-based photonic computers have the potential to run at least 20 times faster in theory, and they won’t produce the heat or require anywhere near as much energy as existing devices. Because they process data in the form of photons rather than electrons, IBM, Intel and others have been seeking a method of slowing data down from its light form so that the processing power of modern computers is sufficient to handle it all.

Coding information into photons is surprisingly easy, as we do when we send information via optical fiber – however, retrieving information at the speed of light and then processing it is not yet possible because it’s simply too fast. This new alternative slows the light down by converting it into sound, allowing researchers from the University of Sydney in Australia to access data at speeds far greater than electronic computing, though significantly slower than the speed of light.

“The information in our chip in acoustic form travels at a velocity five orders of magnitude slower than in the optical domain,” said project supervisor Birgit Stiller. However, that also means that computers can still achieve amazingly high speeds with no heat caused by electronic resistance, and no interference from electromagnetic radiation.

“This is an important step forward in the field of optical information processing as this concept fulfills all requirements for current and future generation optical communication systems,” added team member Benjamin Eggleton.

First, photonic information enters the chip as a pulse of light, where it interacts with a ‘write’ pulse, producing an acoustic wave that stores the data. Another pulse of light, called the ‘read’ pulse, then accesses this sound data and transmits it as light once more. While unimpeded light will pass through the chip in 2 to 3 nanoseconds, information stored as a sound wave can remain on the chip for up to 10 nanoseconds, which is long enough for it to be retrieved and processed.
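As a rough back-of-the-envelope check of those figures, here is a small sketch; the waveguide speed is our own illustrative assumption, not a value from the paper:

```python
# Back-of-the-envelope check of the numbers quoted above (illustrative only;
# the exact in-chip light speed is an assumption, not a figure from the paper).

LIGHT_SPEED_IN_CHIP = 2.0e8   # m/s, roughly c reduced by an assumed waveguide refractive index of ~1.5
SLOWDOWN_FACTOR = 1e5         # "five orders of magnitude slower", per the researchers

acoustic_speed = LIGHT_SPEED_IN_CHIP / SLOWDOWN_FACTOR
print(f"Acoustic wave speed: {acoustic_speed:,.0f} m/s")   # ~2,000 m/s, i.e. a few kilometres per second

# Transit time vs. storage window quoted in the article
optical_transit_ns = 2.5   # unimpeded light crosses the chip in 2 to 3 ns
storage_window_ns = 10.0   # data held as sound for up to 10 ns

print(f"Extra time gained for processing: ~{storage_window_ns - optical_transit_ns:.1f} ns")
```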

“Building an acoustic buffer inside a chip improves our ability to control information by several orders of magnitude,” said team member Moritz Merklein. “Our system is not limited to a narrow bandwidth. So unlike previous systems, this allows us to store and retrieve information at multiple wavelengths simultaneously, vastly increasing the efficiency of the device,” added Stiller.

The research has been published in the peer-reviewed journal Nature Communications and may usher in a new era of information dissemination at the speed of sound, as we continue searching for ways to go even faster and one day access information at true light speed ahead.

08 Sep 2017

Equifax Hack Hits 143 Million Consumers

by Bill

It is now being widely reported that the credit reporting agency Equifax has been breached and that hackers may have obtained personal information, including credit card and Social Security numbers, of as many as 143 million consumers in what may be the largest financial security breach of all time.

“This is clearly a disappointing event for our company, and one that strikes at the heart of who we are and what we do,” said Chief Executive Officer Richard Smith of Equifax in the official statement by the company about the incident. “I apologize to consumers and our business customers for the concern and frustration this causes.”

In fairness, it is important to point out that even the best security systems in the world remain potential targets of hackers who tirelessly attempt to overcome the many barriers against entry. In what has become a constant game of cat and mouse, hosting companies and cybersecurity analysts continually develop new layers of protection while nefarious groups seek to worm their way in – and even if the good guys win 99.99999% of the time, even a single breach among thousands of attempts is deemed unacceptable by the media and the public.

Lost in much of the reporting is the widespread impact these kinds of events often have on digital commerce. They have an unfortunate chilling effect on consumer confidence, and on the merchant side they can be devastating, as cards are reissued and recurring billing that would otherwise have continued comes to an abrupt halt.

National Net will continue our work creating and implementing leading-edge strategies to secure every bit and byte of data that passes through our servers, and we are always available to provide additional service and support if any incident does occur. Hopefully things like the quantum entanglement transmissions we reported on a few weeks ago will eventually create a truly impervious data network for everyone, and until then the best we can all do is maintain eternal vigilance as we work to preempt every attempt.

01 Sep 2017

SanDisk Unveils Groundbreaking 400GB microSD Card

by Bill

Remember those 5 ¼” floppy disks back in the day? Single sided, double sided, and then the high-density version that packed a whopping 1.2 MB onto a single giant disk bigger than your whole hand? Now, just a few decades later, SanDisk has unveiled a groundbreaking new 400GB microSD card that is smaller than your fingertip and holds as much data as 333,333 of those old floppy disks!
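For the curious, the floppy-disk math checks out. A quick illustrative calculation, using decimal gigabytes as storage vendors do:

```python
# Quick sanity check of the floppy-disk comparison above (illustrative arithmetic only).

FLOPPY_CAPACITY_MB = 1.2      # high-density 5.25" floppy disk
CARD_CAPACITY_GB = 400

card_capacity_mb = CARD_CAPACITY_GB * 1000        # decimal gigabytes, as storage vendors measure them
floppies = card_capacity_mb / FLOPPY_CAPACITY_MB

print(f"{floppies:,.0f} floppy disks")            # ~333,333
```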

The driving force behind all the R&D is the ever-expanding need for capacity, especially on mobile devices now equipped with features like 4K video recording, along with an undercurrent of distrust of cloud storage services due to hacks and widely reported governmental intrusions.

The 400GB SanDisk Ultra microSDXC UHS-I card was first made public on Thursday morning at the IFA 2017 conference, offering an impressive 144GB more storage than any other microSD card on the planet. It supports a blazing fast 100MB/s read speed, A1 app performance, and UHS Speed Class 1 for the best performance available, so having all that extra space doesn’t come at the cost of reduced speed or increased load times.

Some tech magazines have lambasted the new card because it comes with a retail price of $249.99 on store shelves… which does make us wonder: how much would those tech reporters have been willing to pay back in the day for more than 300,000 floppy disks’ worth of storage working at more than 1,000x the access speed and taking up almost zero physical space? The price of progress isn’t always low, but it has proven itself time and again to be worth every penny.

25 Aug 2017

The Constant in E = mc² May No Longer Apply to the Speed of Information

by Bill

One of the simplest, most elegant, most profound and far-reaching of all human intellectual achievements was Einstein’s formulation of E = mc², which describes the fact that energy is equal to mass times the speed of light squared. The equation has led to massive scientific advancements. It also posits the idea that the entire universe is restricted by one constant that never changes. Einstein suggested that the speed of light, which is 2.99792458 × 10⁸ m/s, serves as a sort of universal speed limit that no particle can ever surpass, and countless experiments in the visible world seemed to back up that claim. However, in the incredibly microscopic world of quantum mechanics, things become much more fuzzy. Now, in data transmission, there are modern breakthroughs that suggest information need not obey Einstein’s formulations.
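To put the formula in concrete terms, here is a quick worked example (our own illustration, not from the article): the energy equivalent of a single kilogram of mass.

```python
# Worked example of E = mc^2 (illustrative only): the energy locked in one kilogram of mass.

c = 2.99792458e8              # speed of light in m/s, the constant discussed above
mass_kg = 1.0

energy_joules = mass_kg * c ** 2
print(f"E = {energy_joules:.3e} J")   # ~8.99e16 joules, roughly 21 megatons of TNT equivalent
```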

Using a principle known as quantum entanglement, scientists have discovered that two particles can be “paired” so that each will react to changes affecting the other instantaneously, even across huge distances. In a paper published in Science on June 16, a Chinese team reported that it had achieved its goal, measuring more than 1,000 pairs of photons to verify that they were indeed entangled, as predicted, and that team has gone on to make use of the entangled particles in a revolutionary new approach to satellite communications.

The Quantum Science Satellite, nicknamed Micius or Mozi (Chinese: 墨子), was designed to establish a ‘hack-proof’ communications system of unimaginable speed and precision. Now initial tests are proving it actually works.

The key thing to keep in mind is that with quantum entanglement, data isn’t actually being sent or received anywhere. Instead, the paired particles are altered in one location and instantaneously become altered in exactly the same way in another location. That allows the data to exist in two places at once without the need for anyone to send it anywhere… because it is already there the moment they create it.

Many publications are touting this new technology as a way to move data without any possibility of it being snooped or hacked by a third party, since there is no actual transmission. Yes, that is fundamentally accurate and impressive, but far too few are taking note of the fact that this also means data can be “sent” anywhere without speed even being a factor. As one example, if we sent a human crew on a mission to Mars, any traditional communication they sent to Earth would take at least 12.5 minutes to arrive, because Mars is roughly 12.5 light-minutes away from Earth. Up until now that delay was not something that could be overcome. Planets farther away, or worlds in other solar systems, would involve delays of years in communications… but if we sent that same crew with entangled particles, they could theoretically speak to people here on Earth with zero latency.
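A quick bit of arithmetic, using the article's 12.5 light-minute figure (Mars' actual distance varies with the planets' orbital positions), shows just how large the gap is between radio and an entangled link:

```python
# Rough illustration of the Mars example above. Mars' distance from Earth varies with
# orbital positions; the article's 12.5 light-minute figure is used here as-is.

SPEED_OF_LIGHT_KM_S = 299_792.458
light_minutes = 12.5

distance_km = light_minutes * 60 * SPEED_OF_LIGHT_KM_S
round_trip_minutes = light_minutes * 2

print(f"Earth-Mars distance at that separation: ~{distance_km:,.0f} km")          # ~225 million km
print(f"Minimum round trip for a question and answer by radio: {round_trip_minutes} minutes")
print("Theoretical latency described above for an entangled link: 0 seconds")
```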

Now apply that same logic to modern computing and it becomes easy to see why everyone from high-frequency traders on Wall Street to video game creators and ad networks is excited by the prospect of immediate information without the need to send data to or from anywhere ever again!

23 Aug 2017

FCC Transparency Called Into Question

by Bill

Ajit Pai, the Chairman of the Federal Communications Commission, said in February that he wanted the agency to be “as open and accessible as possible to the American people.” Now people are calling that statement into question due to some key information that has become surprisingly hard to acquire.

The FCC’s recent handling of complaints from the public about internet providers and the still murky cause(s) of a May 7th outage of the public comments section of the FCC’s own website are garnering interest from politicians and the public.

“Chairman Pai promised to make the FCC more transparent, but the early returns aren’t looking good,” says U.S. Senator Ron Wyden (D-Oregon), in a statement. “The FCC seems more concerned with helping Big Cable than living up to his promise.”

Pai declined to be interviewed by Wired about the issues, but a spokesperson told the magazine that the chair “is proud of the transparency measures he has instituted at the FCC.”

Still, complaints persist about a lack of transparency at the FCC regarding the commission’s stated plan to reverse some of its own net-neutrality rules, which prohibit internet providers from favoring some forms of traffic over others.

The FCC has stated that it received only one formal complaint about the shift in policy, but fails to mention that the agency has received more than 47,000 informal complaints about net-neutrality violations since the rules took effect in 2015. That’s significant because a formal complaint costs $225 to file and often requires lawyers, procedural rules, and written pleadings, while informal complaints can be filed for free with a simple online form. An accurate accounting of actual complaints remains elusive, and on July 26th the watchdog group American Oversight sued to obtain the records; again, the FCC declined to comment on the suit.

A May 7th outage of the commission’s public comment system followed a segment of the television show Last Week Tonight with John Oliver, in which the host asked viewers to file comments about net neutrality. The next day, the FCC blamed the outage on a cyber attack. “Our analysis reveals that the FCC was subject to multiple distributed denial-of-service attacks,” said FCC chief information officer David Bray in a statement published on May 8th. “These were deliberate attempts by external actors to bombard the FCC’s comment system with a high amount of traffic.”

However, journalist Kevin Collier filed suit against the agency after it did not respond to his April 26 FOIA request, and the FCC told tech news site Gizmodo it had no records predating Bray’s statement related to the “analysis” he referenced.

It becomes increasingly difficult for an agency like the FCC to claim net neutrality is not in the public interest when the public clearly keeps saying it is… and when the FCC has decided to stop accurately reporting what the public has been saying all along.

We will continue to monitor this story and the political chatter regarding net neutrality, as it directly affects our clients and the Internet in a broader sense. When regulations change, business strategies must adapt, and National Net is always about putting our clients first, with as much foresight as we can muster to assist in the delicate process of staying ready for whatever comes down the pipe… transparently announced or otherwise.

10 Aug 2017

Facebook Forced To Shut Down AI That Created Its Own Language

by Bill

As computers become smarter, people may mistake the speed of calculations for the entirely different set of capabilities that researchers refer to as artificial intelligence. AI is not precisely about speed or processing power; it’s about the possibility of a machine learning to think well enough to develop its own creative solutions and, eventually, to think of things its human creators were unable to come up with on their own. Facebook recently built an AI that did exactly that, and its human overlords became so frightened by the result that they immediately pulled the plug on the project… for now.

As was widely reported, Facebook needed to pull the plug on its artificial intelligence system because it accomplished what they wanted and was immediately deemed to be too far out of hand. The systems were created to talk to each other and make trades with one another. When they began exchanging what researchers assumed were nonsense statements, yet ended up making trades based on those statements, it became clear that the machines had stopped using English and started using a language they had created on their own: a language their creators were entirely unable to comprehend.

Bob: “I can can I I everything else.”

Alice: “Balls have zero to me to me to me to me to me to me to me to me to.”

The above passages may make no sense to humans, but they are an actual conversation between two AI agents. The agents started out talking to each other in plain English, but eventually negotiated in this new language that only they understood.

The implications are obvious and serious. First, Facebook did manage to create machine AI systems capable of thought, and perhaps far more importantly, this result shows that when left to their own devices, machine AI systems will seek to answer questions or solve problems other than the ones their creators tasked them with at the start.
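One widely reported explanation, not spelled out above, is that the agents were rewarded only for the outcome of the negotiation, with nothing in the objective anchoring them to human-readable English. The toy sketch below is entirely hypothetical (not Facebook's actual training code) and simply illustrates a reward function that never looks at the dialogue itself:

```python
# Toy sketch -- entirely hypothetical, NOT Facebook's actual training code -- of a
# negotiation reward that scores only the deal, never the words used to reach it.

def negotiation_reward(items_won, item_values, dialogue):
    """Reward depends solely on the value of the items secured in the trade."""
    deal_value = sum(item_values[item] * count for item, count in items_won.items())
    # `dialogue` is never inspected: nothing penalizes drifting away from plain English.
    return float(deal_value)

# Example: an agent wins 2 balls (worth 1 each) and 1 hat (worth 5), and earns the same
# reward whether its side of the conversation was readable English or repetitive shorthand.
reward = negotiation_reward(
    items_won={"balls": 2, "hats": 1},
    item_values={"balls": 1, "hats": 5, "books": 0},
    dialogue=["I can can I I everything else."],
)
print(reward)  # 7.0
```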

Will this be the way humans cure cancer? Will it eventually be the way machines learn to eradicate water (which has long been the enemy of anything electronic), or will researchers somehow find ways to safeguard their systems while creating machines smarter than themselves?

Given that the goal is to make a system smarter than the person who created it, it stands to reason that we are all running out of time before machine-generated malware finds a way to establish the primacy of new apex predators in a radically new age.

As leading technologist and Tesla CEO Elon Musk said earlier this month at the National Governors Association Summer Meeting in Rhode Island: “I have exposure to the very most cutting-edge AI, and I think people should be really concerned about it.”

31 Jul 2017

Netflix Engineers Devise, Deploy, Test And Solve A Rare DDoS Attack

by Bill

Netflix security engineers recently devised and ran a rare kind of DDoS attack on their own infrastructure as a test of the streaming system’s security measures. They brought the whole site down, proved Netflix was vulnerable to this unorthodox type of distributed denial-of-service attack, and solved the problem for their own site while open sourcing the solution for others. As hackers collude on ways to damage their targets, this new era of cooperation among security professionals is leveling the battlefield and allowing hosts to resolve attacks faster than previously possible by sharing their findings.

Normally, a DDoS strike floods a website with junk traffic, often from compromised IoT devices, overwhelming the system with a near-limitless stream of requests. Netflix is built to handle more than 35TB per second of data during peak hours and has a network of Open Connect appliances, making it a very difficult target for traditional DDoS attacks.

The new DDoS technique turned Netflix’s application programming interface (API) against itself. Netflix realized an attacker could send carefully chosen, resource-intensive requests that trigger more and more requests internally, causing a cascade of data deep in the system. In this way, an attacker could easily and cheaply create a significant resource burden, or even take Netflix down.
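To make the amplification idea concrete, here is a deliberately simplified, hypothetical sketch; the endpoint and fan-out numbers are our own inventions, not Netflix's actual API:

```python
# Hypothetical sketch of application-layer DDoS amplification (not Netflix's real API).
# One inexpensive external request fans out into many expensive internal calls, so the
# attacker spends far less effort than the system defending against the cascade.

def fetch_from_internal_service(item_id):
    """Stand-in for a call to a downstream microservice (database hit, render, etc.)."""
    return {"id": item_id, "metadata": "..."}

def handle_catalog_request(row_ids, titles_per_row=50):
    """A single external API call that fans out into rows x titles internal lookups."""
    results = []
    for row in row_ids:
        for i in range(titles_per_row):
            results.append(fetch_from_internal_service(f"{row}:{i}"))
    return results

# An attacker asking for 100 rows turns 1 external request into 5,000 internal ones.
internal_calls = len(handle_catalog_request([f"row-{n}" for n in range(100)]))
print(internal_calls)  # 5000
```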

As site owners and businesses continue integrating each other’s services via APIs and other measures, that interconnectivity itself is becoming a prime target for attacks. No service is ever entirely safe from malicious attackers, but thanks to Netflix the Internet is now better protected against these kinds of application-layer DDoS attacks on internal, data-requesting services.

The evolution of attacker strategies never ends, but as companies like Netflix, hosts like National Net, and many others in the digital data community continue to work together, protecting against these types of application DDoS assaults and so many other present data dangers gets a fair bit easier and a lot faster. Stay tuned for continued coverage of these important developments as National Net continues to work diligently to keep every client’s servers online with perfect uptime.

10 Jul 2017

Welcome To The Splinternet

by Bill

Since its inception, the Internet has had a centralized structure in place to govern its technical aspects, and a layer of regulations on the kinds of content deemed acceptable by entities including Visa and law enforcement agencies. While the people using the Internet are as diverse as the entire world population, the way the Internet has been moderated has always been a homogenizing factor that brought distant netizens closer together under a single umbrella. That paradigm makes commerce far simpler, and social communication nearly frictionless regardless of national borders… but it is now changing rapidly.

Antitrust regulations in Europe and the “right to be forgotten” are already vastly different concepts from anything legislated by the United States so far. Now many additional fragments are becoming regional rules of law as well. On June 30, Germany passed a law ordering all social media companies operating in the country to delete hate speech within 24 hours of it being posted or face fines of up to $57 million per instance. There is also a recent Canadian Supreme Court ruling that Google must scrub search results about pirated products, along with a May court ruling in Austria that Facebook must take down specific posts deemed hateful toward the country’s Green party leader.

Add in the United States’ relinquishment of control over ICANN as the domain regulatory body, along with several other rulings and new legal orders that are starting to contort content to the local ethics and mores of each community, and what you wind up with is a “Splinternet” that forces businesses to navigate fast-changing, wide-sweeping hurdles that were never part of the Internet until now.

As online billing, hosting, and technical deployment continue to become increasingly specialized services, compliance with these new local ordinances is also becoming a part of what we do here at NationalNet. We understand that our clients do business everywhere a sale is possible, and we will continue to do all we can to ensure your managed servers are fully compliant with whatever the local rule of law requires now, and into the future. If you have any questions, be sure to give us a call so we can further assist you.
