08 Sep 2017

Equifax Hack Hits 143 Million Consumers

by Bill

It is now being widely reported that the credit reporting agency Equifax has been breached, and that hackers may have obtained personal information, including credit card and Social Security numbers, of as many as 143 million consumers in what may be the largest financial security breach of all time.

“This is clearly a disappointing event for our company, and one that strikes at the heart of who we are and what we do,” said Chief Executive Officer Richard Smith of Equifax in the official statement by the company about the incident. “I apologize to consumers and our business customers for the concern and frustration this causes.”

In fairness, it is important to point out that even the best security systems in the world remain potential targets of hackers who tirelessly attempt to overcome the many barriers to entry. In what has become a constant game of cat and mouse, hosting companies and cybersecurity analysts continually develop new layers of protection while nefarious groups seek to worm their way in – and even if the good guys win 99.99999% of the time, a single breach among thousands of attempts is deemed unacceptable by the media and the public.

Lost in much of the reporting is the widespread impact these kinds of events often have on digital commerce. They can have an unfortunate chilling effect on consumer confidence, and on the merchant side they can be devastating: compromised cards are often reissued, and recurring billing that would otherwise have continued comes to an abrupt halt.

National Net will continue our work creating and implementing leading-edge strategies to secure every bit and byte of data that passes through our servers, and we are always available to provide additional service and support if any incident does occur. Hopefully technologies like the quantum entanglement transmissions we reported on a few weeks ago will eventually create a truly impervious data network for everyone; until then, the best we can all do is maintain eternal vigilance as we work to preempt every attempt.

01 Sep 2017

SanDisk Unveils Groundbreaking 400GB microSD Card

by Bill

Remember those 5 ¼” floppy disks back in the day? Single sided, double sided, and then the high-density version that packed a whopping 1.2 MB onto a single giant disk bigger than your whole hand? Now, just a few decades later, SanDisk has unveiled a groundbreaking new 400GB microSD card that is smaller than your fingertip and holds as much data as 333,333 of those old floppy disks!
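For anyone who wants to check our math, here is a quick back-of-the-envelope sketch in Python, using the decimal units card makers quote (1 GB = 1,000 MB):

```python
# Back-of-the-envelope math behind the "333,333 floppies" comparison.
# Assumes decimal units (1 GB = 1,000 MB), which is how card capacities are marketed.

FLOPPY_MB = 1.2   # high-density 5.25" floppy capacity, in MB
CARD_GB = 400     # SanDisk Ultra microSDXC capacity, in GB

floppies = (CARD_GB * 1000) / FLOPPY_MB
print(f"{CARD_GB} GB card ≈ {floppies:,.0f} floppy disks")
# -> 400 GB card ≈ 333,333 floppy disks
```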

The driving forces behind all the R&D are the ever-expanding need for capacity, especially on mobile devices now equipped with features like 4K video recording, and an undercurrent of distrust of cloud storage services due to hacks and widely reported governmental intrusions.

The 400GB SanDisk Ultra microSDXC UHS-I card was first made public on Thursday morning at the IFA 2017 conference, offering an impressive 144GB more storage than any other microSD card on the planet. It supports blazing-fast 100MB/s read speeds, an A1 app performance rating, and UHS Speed Class 1, so all that extra space doesn’t come at the cost of reduced speed or increased load times.

Some tech magazines have lambasted the new card because it comes with a retail price of $249.99 on store shelves… which does make us wonder: how much would those tech reporters have been willing to pay back in the day for more than 300,000 floppy disks’ worth of memory, working at more than 1,000x the access speed and taking up almost zero physical space? The price of progress isn’t always low, but it has proven itself time and again to be worth every penny.

25 Aug 2017

The Constant in E = mc² May No Longer Apply to the Speed of Information

by Bill

One of the simplest, most elegant, most profound and far-reaching of all human intellectual achievements was Einstein’s formulation of E = mc², which describes the fact that energy is equal to mass times the speed of light squared. The equation has led to massive scientific advancements. It also posits the idea that the entire universe is restricted by one constant that never changes. Einstein suggested that the speed of light, which is 2.99792458 × 10⁸ m/s, serves as a sort of universal speed limit that no particle can ever surpass, and countless experiments in the visible world seemed to back up that claim. However, in the incredibly microscopic world of quantum mechanics, things become much fuzzier. Now, in data transmission, there are modern breakthroughs that suggest information need not obey Einstein’s formulations.

Using a principle known as quantum entanglement, scientists have discovered that two particles can be “paired” and that each will react to changes affecting the other instantaneously, even across huge distances. In a paper published in Science on June 16, a Chinese research team reported that it had achieved its goal, measuring more than 1,000 pairs of photons to verify that they were indeed entangled, as predicted, and that team has gone on to make use of the entangled particles in a revolutionary new approach to satellite communications.

The Quantum Science Satellite, nicknamed Micius or Mozi (Chinese: 墨子), was designed to establish a ‘hack-proof’ communications system of unimaginable speed and precision. Now initial tests are showing that it actually works.

The key thing to keep in mind is that with quantum entanglement, data isn’t actually being sent or received anywhere. Instead, the paired particles are altered in one location and instantaneously become altered in exactly the same way in another location. That allows the data to exist in two places at once without the need for anyone to send it anywhere… because it is already there the moment it is created.

Many publications are touting this new technology as a way to move data without any possibility of it being snooped or hacked by a third party, since there is no actual transmission. Yes, that is fundamentally accurate and impressive, but far too few are taking note of the fact that this also means data can be “sent” anywhere without speed even being a factor. As one example, if we sent a human crew on a mission to Mars, any traditional communication they sent back would take about 12.5 minutes to arrive, because Mars is, on average, roughly 12.5 light-minutes from Earth. Up until now that delay was not something that could be overcome, and planets farther away, or other solar systems, would involve delays of years… but if we sent that same crew with entangled particles, they could theoretically speak to people here on Earth with zero latency.
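To put those delays in concrete terms, here is a small back-of-the-envelope sketch of the one-way delay for a conventional, speed-of-light transmission. The distances are illustrative round numbers of our own choosing (Earth and Mars actually range from roughly 3 to 22 light-minutes apart depending on where each planet sits in its orbit):

```python
# One-way light-travel delay for a conventional transmission.
# Distances are illustrative round figures, not precise ephemeris values.

SPEED_OF_LIGHT_KM_S = 299_792.458  # km per second

def one_way_delay_minutes(distance_km: float) -> float:
    """Return the light-travel time in minutes over the given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60

# Average Earth-Mars distance is roughly 225 million km
print(f"Mars: {one_way_delay_minutes(225e6):.1f} minutes")  # ~12.5 minutes

# Proxima Centauri, the nearest star system, is roughly 4.0e13 km away
minutes_per_year = 60 * 24 * 365
print(f"Proxima Centauri: {one_way_delay_minutes(4.0e13) / minutes_per_year:.1f} years")  # ~4.2 years
```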

Now apply that same logic to modern computing and it becomes easy to see why everyone from high-frequency traders on Wall Street to video game creators and ad networks is excited by the prospect of immediate information without the need to send data to or from anywhere ever again!

23 Aug 2017

FCC Transparency Called Into Question

by Bill

Ajit Pai, the Chairman of the Federal Communications Commission, said in February that he wanted the agency to be “as open and accessible as possible to the American people.” Now people are calling that statement into question due to some key information that has become surprisingly hard to acquire.

The FCC’s recent handling of complaints from the public about internet providers and the still murky cause(s) of a May 7th outage of the public comments section of the FCC’s own website are garnering interest from politicians and the public.

“Chairman Pai promised to make the FCC more transparent, but the early returns aren’t looking good,” says U.S. Senator Ron Wyden (D-Oregon), in a statement. “The FCC seems more concerned with helping Big Cable than living up to his promise.”

Pai declined to be interviewed by Wired Magazine about the issues, but a spokesperson told the magazine that the chairman “is proud of the transparency measures he has instituted at the FCC.”

Still, complaints persist about a lack of transparency at the FCC regarding the commission’s stated plan to reverse some of its own net-neutrality rules, which prohibit internet providers from favoring some forms of traffic over others.

The FCC has stated that it received only one formal complaint about the shift in policy, but has failed to mention that the agency has received more than 47,000 informal complaints about net-neutrality violations since the rules took effect in 2015. That’s significant because a formal complaint costs $225 to file and often requires lawyers, procedural rules, and written pleadings, while informal complaints can be filed for free with a simple online form. An accurate accounting of actual complaints continues to be elusive, and on July 26th the watchdog group American Oversight sued to obtain the records, but again the FCC declined to comment on the suit.

A May 7th outage of the commission’s public comment system followed a segment of the television show Last Week Tonight with John Oliver, in which the host asked viewers to file comments about net neutrality. The next day, the FCC blamed the outage on a cyberattack. “Our analysis reveals that the FCC was subject to multiple distributed denial-of-service attacks,” said FCC chief information officer David Bray in a statement published on May 8th. “These were deliberate attempts by external actors to bombard the FCC’s comment system with a high amount of traffic.”

However, journalist Kevin Collier filed suit against the agency after it did not respond to his April 26 FOIA request, and the FCC told tech news site Gizmodo it had no records predating Bray’s statement related to the “analysis” he referenced.

It becomes increasingly difficult for an agency like the FCC to claim net neutrality is not in the public interest when the public clearly keeps saying it is… and when the FCC has decided to stop accurately reporting what the public has been saying all along.

We will continue to monitor this story and the political chatter regarding net neutrality as it directly affects our clients and the Internet in a broader sense. When regulations change, business strategies must adapt, and National Net is always about putting our clients first, with as much foresight as we can muster to assist in the delicate process of staying ready for whatever comes down the pipe… transparently announced or otherwise.

10 Aug 2017

Facebook Forced To Shut Down AI That Created Its Own Language

by Bill

As computers become smarter, people may mistake the speed of calculations for the entirely different set of capabilities that make up what researchers refer to as artificial intelligence. AI is not precisely about speed or processing power; it’s about the possibility of a machine learning to think well enough to develop its own creative solutions and eventually to think of things its human creators were unable to come up with on their own. Facebook recently built an AI that did exactly that, and its human overlords became so frightened by the result that they immediately pulled the plug on the project… for now.

As was widely reported, Facebook needed to pull the plug on their artificial intelligence system because it accomplished what they wanted and was immediately deemed too far out of hand. The systems were created to talk to each other and make trades with one another. When they began exchanging what researchers assumed to be nonsense statements, yet ended up making trades based on those statements, it became clear that the machines had stopped using English and started using a language they created on their own: a language their creators were entirely unable to comprehend.

Bob: “I can can I I everything else.”

Alice: “Balls have zero to me to me to me to me to me to me to me to me to.”

The above passages may make no sense to humans, but they are an actual conversation that took place between two AI agents. The agents started out talking to each other in plain English, but eventually negotiated in this new language that only the AI systems understood.

The implications are obvious and serious. First, Facebook did manage to create machine AI systems capable of thought, and perhaps far more importantly, this result shows that when left to their own devices, machine AI systems will seek to answer questions or solve problems other than the ones their creators tasked them with at the start.

Will this be the way humans cure cancer? Will it eventually be the way machines learn to eradicate water (which has long been the enemy of anything electronic), or will researchers somehow find ways to safeguard their systems while creating machines smarter than themselves?

Given the fact that the goal is to make a system smarter than the person who created it, it stands to reason that we are all running out of time before machine generated malware finds a way to establish the primacy of new apex predators in a radically new age.

As leading technologist and Tesla CEO Elon Musk said earlier this month at the National Governors Association Summer Meeting in Rhode Island: “I have exposure to the very most cutting-edge AI, and I think people should be really concerned about it.”

31 Jul 2017

Netflix Engineers Devise, Deploy, Test And Solve A Rare DDoS Attack

by Bill

Netflix security engineers recently devised and ran a rare kind of DDoS attack on their own infrastructure as a test of the streaming system’s security measures. They brought the whole site down, proved Netflix was vulnerable to the unorthodox type of distributed denial-of-service attack, and solved the problem for their own site while open-sourcing the solution for others. As hackers collude on ways to damage their targets, this new era of cooperation among security professionals is leveling the battlefield, allowing hosts to resolve attacks faster than previously possible by sharing their findings.

Normally, a DDoS strike floods a website with junk traffic, often from compromised IoT devices, overwhelming the system with a seemingly limitless stream of requests. Netflix is built to handle more than 35TB of data per second during peak hours and has a network of Open Connect devices, making it a very difficult target for traditional DDoS attacks.

The new DDoS technique turned Netflix’s application programming interface against itself. Netflix realized an attacker could send carefully chosen, resource-intensive requests that trigger more and more requests internally, causing a cascade of work deep in the system. In this way, an attacker could easily and cheaply create a significant resource burden, or even take Netflix down.
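To make the shape of that attack class a little more concrete, here is a hypothetical sketch of one common defensive idea: giving each inbound request a budget for the internal calls it may trigger, so a single cheap request cannot fan out indefinitely. This is purely illustrative and is not Netflix’s actual open-sourced solution; the class names, limits and fan-out below are our own placeholders.

```python
# Hypothetical sketch: cap how much internal work a single external API request
# may trigger, so expensive requests cannot cascade into a self-inflicted DDoS.

class RequestBudgetExceeded(Exception):
    pass

class RequestBudget:
    """Tracks a per-request 'cost' budget for downstream/internal calls."""

    def __init__(self, max_internal_calls: int = 50):
        self.max_internal_calls = max_internal_calls
        self.used = 0

    def charge(self, cost: int = 1) -> None:
        self.used += cost
        if self.used > self.max_internal_calls:
            # Fail fast instead of letting the request fan out indefinitely.
            raise RequestBudgetExceeded(
                f"request exceeded its internal-call budget ({self.max_internal_calls})"
            )

def fetch_recommendations(budget: RequestBudget, user_id: str, depth: int = 0):
    budget.charge()  # every internal hop consumes budget
    if depth >= 3:
        return []
    # ... real code would call downstream services here, passing the same budget ...
    return [fetch_recommendations(budget, user_id, depth + 1) for _ in range(4)]

budget = RequestBudget(max_internal_calls=50)
try:
    fetch_recommendations(budget, "user-123")
except RequestBudgetExceeded as err:
    print(f"503 Service Unavailable: {err}")
```

In this toy example the single inbound call fans out into dozens of internal calls, blows through its budget, and is rejected early rather than being allowed to consume resources deep in the stack.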

As site owners and businesses continue integrating each other’s services via APIs and other measures, that interconnectivity itself is becoming a prime target for attacks. No service is ever entirely safe from malicious attackers, but thanks to Netflix the Internet is now safer than it was from these kinds of application-layer DDoS attacks that abuse internal data requests.

The evolution of attacker strategies never ends, but as companies like Netflix, hosts like National Net and many others in the digital data community continue to work together, protecting against these types of application-layer DDoS assaults, and so many other present data dangers, gets a fair bit easier and a lot faster to implement. Stay tuned for continued coverage of these important developments as National Net continues to work diligently to keep every client’s servers online with perfect uptime.

10 Jul 2017

Welcome To The Splinternet

by Bill

Since its inception, the Internet has always had a centralized structure in place to govern its technical aspects, and a layer of regulations on the kinds of content deemed acceptable by entities including Visa and law enforcement agencies. While the people using the Internet are as diverse as the entire world population, the way the Internet is moderated has always been a homogenizing factor that brought distant netizens closer together under a single umbrella. That paradigm makes commerce far simpler, and social communication nearly frictionless regardless of national borders… but it is now changing rapidly.

Antitrust regulations in Europe and the “right to be forgotten” are already vastly different concepts from anything legislated by the United States so far. Now many additional fragments are becoming regional rules of law as well. On June 30, Germany passed a law ordering all social media companies operating in the country to delete hate speech within 24 hours of it being posted or face fines of up to $57 million per instance. There is also a recent Canadian Supreme Court ruling that Google must scrub search results about pirated products, along with a May court ruling in Austria that Facebook must take down specific posts considered hateful toward the country’s Green party leader.

Add in the United States ceding its oversight of ICANN as the domain regulatory body, and several other rulings or new legal orders that are starting to contort content to the local ethics and mores of each community – and what you wind up with is a Splinternet that forces businesses to navigate fast-changing, wide-sweeping hurdles that were never part of the Internet until now.

As online billing, hosting, and technical deployment continue to become increasingly specialized services, compliance with these new local ordinances is also becoming a part of what we do here at NationalNet. We understand that our clients do business everywhere a sale is possible, and we will continue to do all we can to ensure your managed servers are fully compliant with whatever the local rule of law requires now, and into the future. If you have any questions, be sure to give us a call so we can further assist you.

26 Jun 2017

The IoT Is Coming Whether Data Security Is Ready or Not

by Bill

According to a fascinating May study of 553 IT decision makers, 78% said they thought it was at least somewhat likely their business would suffer data loss or theft due to the advancement of IoT (Internet of Things) devices like smoke detectors, cameras and home appliances. Furthermore, 72% said the speed of IoT advancement makes it difficult to keep up with the rapid rate of change across a long list of evolving security requirements.

Meanwhile, Gartner predicts that by 2020 there will be 21 billion IoT devices in existence, an amazing rise from just over 5 billion active devices in 2015. Furthermore, the firm expects 8 billion of those devices to be industrial, which in many ways makes them an even easier target for mass hacking campaigns or malware attacks.

The important takeaway from all of this is that the IoT revolution is here now and it is not showing any signs of slowing down. The speed of adoption by consumers is a much stronger force for change than the desire of security experts for the status quo or for a more measured roll out of these new products to avoid vulnerabilities.

That conflict, and the eventuality that the world will soon have many more active IoT devices than it can safeguard, is an important reason to ensure you have a clearly defined, robust and secure data recovery plan in place for all aspects of your online existence. Routine backups, offline ‘hard backups’, continuous monitoring and a vigilant eye toward data security best practices may soon be the difference between a brief bump in the road and being entirely out of business for an extended period of time.
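As one small, hypothetical illustration of the “routine backups” piece of such a plan, a sketch along these lines archives a directory and records a checksum so a restore can be verified later. The paths and naming scheme are placeholders, not a recommendation for any particular layout.

```python
# Minimal backup sketch: archive a directory and record a SHA-256 checksum
# so the archive's integrity can be verified before any restore.
# The paths and naming scheme are placeholders for illustration only.

import hashlib
import tarfile
from datetime import datetime
from pathlib import Path

def backup_directory(source: str, dest_dir: str) -> Path:
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = dest / f"{Path(source).name}-{stamp}.tar.gz"

    # Create the compressed snapshot of the source directory.
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source, arcname=Path(source).name)

    # Record the checksum alongside the archive for later verification.
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    checksum_file = archive.parent / (archive.name + ".sha256")
    checksum_file.write_text(f"{digest}  {archive.name}\n")
    return archive

if __name__ == "__main__":
    backup_directory("/var/www/example-site", "/backups/example-site")
```

A restore procedure would then verify the recorded checksum (for example with sha256sum -c) before unpacking the archive, and copies of both files would be kept off the production host.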

As always, NationalNet is here to provide all of our clients with fully managed hosting services that include many security measures designed to make you less likely to be knocked offline, more likely to keep your data secure, and able to get your sites back online after an attack as quickly as technologically possible. It’s not as reassuring as saying the IoT is going away, but we live in the real world, where forward-thinking business owners must accept hard truths, adapt to challenges and overcome all obstacles on the road to their continued success.

19 Jun 2017

Executive Order Redefining Cybersecurity Protocols and Government Contracts

by Bill

A detailed report by security experts at Arnold & Porter Kaye Scholer LLP published today takes a deep, informative look at the executive order signed on June 15, 2017, entitled “Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure.” The white paper sheds light on many aspects of digital security that will be impacted by the changes in policy and also explains important guidelines for obtaining government contracts as a private business interested in cashing in on what is sure to be a lucrative market for upgrades throughout governmental computer networks.

As the report details: “A look at the main topics addressed in the Executive Order, reveals who from industry should be interested in the direction of the reports: (a) IT modernization and shared services affects IT companies, service providers (including Fintech), managed security services, cloud providers, AI firms, systems integrators, small innovators, foreign IT companies, and others; (b) cybersecurity for critical infrastructure affects the owners of Section 9 Entities and members of the defense industrial base, including its supply chain, platforms, and systems; (c) “transparency” affects all public companies; (d) the section on “botnets,” by its terms, affects the entire “internet and communications ecosystem” (an expansive definition both vertically and horizontally); (e) the “electric subsector” concerns generation, transmission, distribution, and alternative energy; (f) deterrence and international cooperation affects all multinational companies; and (g) the section on the workforce affects traditional academia (universities and K-12 institutions), for-profit schools, and any company that benefits from a cyber skilled workforce.”

Any commercial company or private citizen interested in the ways our national security will be implemented online, or seeking to provide services in that regard should make time to read this white paper. It is extensive and provides an excellent explanation of what is now being rolled out online.

09 Jun 2017

Is Net Neutrality Just A Nice Slogan? The FCC And A US Senator Think So

by Bill

“It’s a great slogan,” said FCC Chairman Ajit Pai when he was asked by a radio host what net neutrality is all about. “But in reality what it involves is Internet regulation, and the basic question is, ‘Do you want the government deciding how the Internet is run?'”

The FCC Chairman’s characterization of net neutrality as a “slogan” that solves no real problems was later bolstered by statements from Sen. Ron Johnson (R-Wis.) on WTMJ Radio in Milwaukee on Monday, when he argued that the Internet should have paid “fast lanes” for some content providers. Johnson went on to explain: “As chairman Pai said, net neutrality is a slogan. What you really want is an expansion of high-speed broadband, and in order to do that you have to create the incentives for those smaller ISPs to invest. They don’t really control their own fiber if the government tells them exactly how they’re going to use their investment.” He also brought out an often-debunked argument that “there’s less incentive to invest, so we’ll have less high-speed broadband” if net neutrality regulations are maintained.

“Chairman Pai just mentioned medical diagnostics,” Johnson said. “You might need a fast lane within that pipeline so those diagnoses can be transmitted instantaneously and not be held up by, I don’t know, maybe a movie streaming.”

It’s very interesting from a policy point of view that, on a tactical level, the ground is quickly shifting as opponents of net neutrality begin to argue that having it isn’t a big deal, rather than arguing on the merits whether it should exist. Some of the arguments being used have already been shown to be patently false, but echoing the claim that net neutrality is a meaningless policy across the FCC and legislative news outlets appears to be having a significant impact on the way the discussion is being framed.
