06 Nov 2017

US Federal Judge Rejects Global De-Indexing Order By Canada’s Highest Court

by Bill

A recent Canadian court order demands that Google “de-index” all pages related to a company named Datalink, which allegedly sold products designed to infringe the intellectual property of the Vancouver-based company Equustek. The order would require Google to alter its search results globally. Google filed suit in US federal court seeking an order making the Canadian ruling unenforceable within the United States, and US District Judge Edward Davila has now granted a preliminary injunction barring enforcement of the Canadian order anywhere in the USA. Davila found that the order violates Section 230 of the Communications Decency Act, which shields online platforms from being held responsible for content posted by their users.

Google claimed the plaintiffs “never established any violation of their rights under US law.” Making the matter even easier for Google, Equustek never appeared to defend itself in the US case, leaving the proceedings entirely one-sided.

The injunction against the Canadian order not only protects Google but also will “serve the public interest,” Davila held. “[T]he Canadian order would hold Google liable as the ‘publisher or speaker’ of the information on Datalink’s websites… By forcing intermediaries to remove links to third-party material, the Canadian order undermines the policy goals of Section 230 and threatens free speech on the global Internet.”

This new ruling reaffirms the position of US courts and legislators regarding claims of online digital piracy by third parties. The issue has been contentious for years, and this ruling is not likely to reduce passions on either side of the debate.

01 Nov 2017

AT&T Opens Up Its New Artificial Intelligence Platform Acumos

by Bill

Google, Amazon, and Microsoft have already released frameworks designed to get developers building AI-powered applications on their platforms. AT&T just entered the fray in a big way with their own new AI platform called Acumos, revealed at a Dallas event today.

Now anyone using Google’s TensorFlow framework to create machine-learning tools will be able to use Acumos both as a directory for sharing AI models and as a structured system for customizing or connecting independent models in useful ways.
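The idea of connecting independent models can be sketched in a few lines. This is an illustrative sketch only, not the Acumos API; the two toy “models” and the `chain` helper below are hypothetical stand-ins for models a developer might publish and compose on such a platform.

```python
# Sketch of chaining independent models so each one's output feeds the next,
# the way a catalog platform lets separate models be connected into pipelines.
# The "models" here are toy stand-ins, not real Acumos components.

def sentiment_model(text):
    """Toy stand-in: score is the fraction of 'positive' words in the text."""
    positive = {"good", "great", "fast"}
    words = text.lower().split()
    return sum(w in positive for w in words) / max(len(words), 1)

def threshold_model(score):
    """Toy stand-in: map a numeric score to a label."""
    return "positive" if score >= 0.5 else "negative"

def chain(*models):
    """Connect independent models into one pipeline."""
    def pipeline(x):
        for m in models:
            x = m(x)
        return x
    return pipeline

classify = chain(sentiment_model, threshold_model)
print(classify("great fast service"))  # 2 of 3 words positive -> "positive"
```

The same composition pattern applies whether the stages are toy functions like these or full TensorFlow models pulled from a shared directory.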

AT&T is developing Acumos in conjunction with Tech Mahindra. The underlying code is entirely open source and is hosted by the Linux Foundation. While AT&T is giving the platform away and the Linux Foundation will host a public version of Acumos that anyone can use, independent organizations can also choose to create private versions of Acumos.

While the media focuses on the wonders of virtual reality as an entertainment medium, the big tech players are far more interested in the limitless possibilities of true artificial intelligence as a problem-solving methodology that is quickly taking shape. NationalNet remains actively interested in and aware of these new technologies because we view it as our responsibility to provide all of our clients with the best tools on the back end and front end to compete with any other product or service online, whether that is through your own intelligence, the business intelligence you gather, or the artificial intelligence we may soon provide!

23 Oct 2017

Federal Regulators Propose Honest Online Ads Law

by Bill

Action seems to be on the horizon as a bipartisan group of federal lawmakers says it wants to impede Russian attempts to influence US elections. The group is proposing new legislation that would require companies including Google, Twitter, and Facebook to fully disclose who is buying political advertisements on each platform and to maintain those records long after elections are over.

The Honest Ads Act would initially require Internet companies to follow rules for political advertising similar to those that now apply to TV, radio, and print media. Facebook recently announced it found roughly 500 “inauthentic” accounts linked to Russian subversion via posts and ad purchases.

However, as Bloomberg News was quick to point out, an Internet troll could easily mask their location and intentions to subvert the new law anyway. A “Jane Doe” with a random location and a localized phone number would be able to buy any political ad she wanted, even if she were actually in Moscow working for the Russian government. The transaction would be recorded but would only show the fake name and other bogus user information.

“Unfortunately, US laws requiring transparency in political campaigns have not kept pace with rapid advances in technology,” said Sen. John McCain (R-Ariz.) in a statement. “Allowing our adversaries to take advantage of these loopholes to influence millions of American voters with impunity.”

“What we want to try to do is start with a light touch,” added Sen. Mark Warner (D-Va.) of the proposed bill.

The weakness of the measure in the real world, and implications of expanding these regulations to non-political ads is also a cause of concern. “So when I want to buy an ad on Facebook to sell a colon cleanse… now they have to report all my personal information to the government and store those records?” said one prominent online marketer we spoke with who asked to be anonymous. “It’s just another overreach and invasion of privacy, putting commercial entities in the middle and demanding they spy on American citizens under the auspices of national security… without any real hope of ancient people in Congress crafting a law that would actually protect our elections.”

10 Oct 2017

Literally Every Yahoo Account Ever Hacked

by Bill

When a company suffers a major hack, you expect a few things. First, some accounts have probably been compromised, but maybe not all of them. Second, not all of the information may be available right away. Yahoo has a lot to answer for: it recently revealed that a 2013 attack on its servers, previously thought to have compromised about 1 billion accounts, in fact affected all 3 billion accounts in existence at the time. This massive revision of how many accounts were affected is very important for consumers, and is also significant because Yahoo was in discussions to be bought out by Verizon in the interim.

The information revealed may not be quite as serious as the sensitive financial data exposed in the recent Equifax breach, but it is serious nonetheless because it includes things like e-mail addresses, phone numbers, and passwords, which can be used in conjunction with the Equifax data to compromise the identities of millions of people globally.
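One reason leaked password data varies so much in severity is how it was stored. As a minimal sketch of the standard defense (salted, slow key derivation so a stolen database cannot simply be read back into passwords), using only Python's standard library:

```python
# Sketch of salted password hashing with PBKDF2: the server stores only
# the salt and derived hash, never the password itself. Iteration count
# and salt size here are illustrative choices, not a specific site's policy.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; return (salt, digest) for storage."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per account
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Re-derive with the stored salt and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored_digest)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("guess", salt, stored))                         # False
```

A breach that exposes only records like these forces attackers into slow per-account guessing, rather than handing them every password outright.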

This rash of hacks has many wondering what the point is of shredding old bank statements and phone bills before discarding them, when it is now so easy for nefarious parties to simply gather your hacked data digitally.

With this climate of hacking, it’s important to work with companies that you truly trust. You can count on NationalNet to provide you with hosting that will keep your data as secure as modern technology allows. Much is being done to research new forms of data security, as previously reported here on the NationalNet Blog, including China’s quantum entanglement plans. As new tools become available on the commercial level, we will continue to implement them quickly, and hopefully Yahoo and the credit bureaus will do the same.

29 Sep 2017

Google Opens Up Search Ad Buys To Rivals With Little Hope Of Improvements

by Bill

Until now, Google has reserved spots at the top of product search results for itself, but it has now promised to open up bidding on those search terms to rivals in Europe. Google announced the change in response to a June ruling by the European Commission, which found in part that Google had systematically abused its dominance in search to gain competitive advantages. This is one of three cases pending against Google in Europe that may lead to sweeping antitrust regulatory action in the EU and beyond if Google fails to impress regulators with its remediation efforts.

The June ruling by the commission ordered Google to pay €2.42 billion ($2.8 billion) in fines, but analysts agree that the one-time expense is nothing compared to the longer-term threat to Google’s business model if regulators persist. For example, Paul Gallant, an analyst with Cowen, is quoted as saying that Google is likely concerned the order could also be applied to other verticals, including travel and local search.

To strengthen its legal footing with EU regulators, Google also said Wednesday that Google Shopping (the service at the core of the case) will begin operating as a standalone unit in Europe, bidding against all other companies for featured spots to help level the playing field. Experts and insiders, however, say that change falls far short of opening up real competition.
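The announcement does not spell out the auction mechanics, but the basic shape of bidding for a featured slot can be illustrated with a generic sealed-bid, second-price auction. This is a simplified textbook mechanism, not Google's actual ad auction, and the bidder names and amounts are invented:

```python
# Generic sealed-bid, second-price auction for a single featured slot:
# the highest bidder wins but pays the second-highest bid.
# Illustrative only; real search-ad auctions are considerably more complex.

def run_auction(bids):
    """bids: dict mapping bidder name -> bid amount. Returns (winner, price)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    # Winner pays the runner-up's bid (or their own, if unopposed).
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"Google Shopping": 1.20, "RivalA": 1.50, "RivalB": 0.90}
winner, price = run_auction(bids)
print(winner, price)  # RivalA wins and pays 1.20, the second-highest bid
```

The critics' point is that a level playing field requires more than Google Shopping merely submitting bids alongside everyone else, since Google also operates the auction itself.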

As Maurice Stucke, a law professor and co-founder of the Konkurrenz Group, explained: “Imagine if a company became a monopoly by burning down all the warehouses and factories of its rivals. If regulators instruct the company to stop burning things down, the monopoly will say, ‘Fine, I can live with that. I promise not to burn down any more factories.’” Yet clearly, that is not a solution to the problem that persists.

It will be interesting to see how these cases are resolved, with millions of search terms and billions or trillions of dollars hanging in the balance.

22 Sep 2017

Light Has Been Stored as Sound For The First Time

by Bill

Scientists have discovered a way to store light-based information as sound waves on a computer chip. Why does that matter? The conversion is critical to shifting away from the inefficient electronic computers currently in use to a light-based computer that moves data at a much faster velocity than anything on the market right now.

Light-based photonic computers have the potential to run at least 20 times faster in theory, and because they process data in the form of photons rather than electrons, they won’t produce heat or require anywhere near as much energy as existing devices. The catch is that IBM, Intel, and others have been seeking a method of slowing the data down from its light form so that the processing power of modern computers is sufficient to handle it.

Coding information into photons is surprisingly easy, as we already do when sending information via optical fiber. However, retrieving information at the speed of light and then processing it is not yet possible because it is simply too fast. This new alternative slows the light down by converting it into sound, allowing researchers from the University of Sydney in Australia to access data at a speed far greater than electronic computing, though significantly slower than the speed of light.

“The information in our chip in acoustic form travels at a velocity five orders of magnitude slower than in the optical domain,” said project supervisor Birgit Stiller. However, that also means that computers can still achieve amazingly high speeds with no heat caused by electronic resistance, and no interference from electromagnetic radiation.
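Stiller's "five orders of magnitude" figure can be sanity-checked with rough numbers. The velocities below are ballpark assumptions for light and sound in a chip waveguide, not figures taken from the paper:

```python
# Rough sanity check of the "five orders of magnitude slower" claim.
# Both velocities are ballpark assumptions, not values from the study.
import math

v_light_in_chip = 2.0e8   # m/s, light in an on-chip waveguide (assumed)
v_sound_in_chip = 2.5e3   # m/s, an acoustic wave in chip material (assumed)

ratio = v_light_in_chip / v_sound_in_chip
orders = math.log10(ratio)
print(f"light is ~{ratio:.0e}x faster ({orders:.1f} orders of magnitude)")
# comes out near 5 orders of magnitude, consistent with Stiller's description
```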

“This is an important step forward in the field of optical information processing as this concept fulfills all requirements for current and future generation optical communication systems,” added team member Benjamin Eggleton.

First, photonic information enters the chip as a pulse of light, where it interacts with a ‘write’ pulse, producing an acoustic wave that stores the data. Another pulse of light, called the ‘read’ pulse, then accesses this sound data and transmits it as light once more. While unimpeded light passes through the chip in 2 to 3 nanoseconds, information stored as a sound wave can remain on the chip for up to 10 nanoseconds, long enough for it to be retrieved and processed.

“Building an acoustic buffer inside a chip improves our ability to control information by several orders of magnitude,” said team member Moritz Merklein. “Our system is not limited to a narrow bandwidth. So unlike previous systems, this allows us to store and retrieve information at multiple wavelengths simultaneously, vastly increasing the efficiency of the device,” added Stiller.

The research has been published in Nature Communications and may usher in a new era of information processing at the speed of sound, as we continue searching for ways to go even faster and one day access information at true light speed.

08 Sep 2017

Equifax Hack Hits 143 Million Consumers

by Bill

It is now being widely reported that the credit reporting agency Equifax has been breached and that hackers may have obtained personal information, including credit card and Social Security numbers, of as many as 143 million consumers in what may be the largest financial security breach of all time.

“This is clearly a disappointing event for our company, and one that strikes at the heart of who we are and what we do,” said Chief Executive Officer Richard Smith of Equifax in the official statement by the company about the incident. “I apologize to consumers and our business customers for the concern and frustration this causes.”

In fairness, it is important to point out that even the best security systems in the world remain potential targets of hackers who tirelessly attempt to overcome the many barriers against entry. In what has become a constant game of cat and mouse, hosting companies and cybersecurity analysts continually develop new layers of protection while nefarious groups seek to worm their way in – and even if the good guys win 99.99999% of the time, even a single breach among thousands of attempts is deemed unacceptable by the media and the public.

Lost in much of the reporting is the widespread impact these kinds of events often have on digital commerce. They can have an unfortunate chilling effect on consumer confidence, and on the merchant side they can be devastating, as cards are reissued and rebills that would otherwise have continued abruptly come to a halt.

NationalNet will continue our work creating and implementing leading-edge strategies to secure every bit and byte of data that passes through our servers, and we are always available to provide additional service and support if any incident does occur. Hopefully, technologies like the quantum entanglement transmissions we reported on a few weeks ago will eventually create a truly impervious data network for everyone; until then, the best we can all do is maintain eternal vigilance as we work to preempt every attempt.

01 Sep 2017

SanDisk Unveils Groundbreaking 400GB microSD Card

by Bill

Remember those 5 ¼” floppy disks back in the day? Single-sided, double-sided, and then the high-density version that packed a whopping 1.2 MB onto a single giant disk bigger than your whole hand? Now, just a few decades later, SanDisk has unveiled a groundbreaking new 400GB microSD card that is smaller than your fingertip and holds as much data as 333,333 of those old floppy disks!
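The 333,333 figure is easy to verify, using the decimal units storage vendors quote (1 GB = 1,000 MB):

```python
# Capacity comparison: one 400 GB microSD card vs 1.2 MB HD floppy disks.
card_gb = 400
floppy_mb = 1.2

card_mb = card_gb * 1000            # decimal units, as storage vendors use
floppies = card_mb / floppy_mb
print(f"{floppies:,.0f} floppies")  # 333,333 floppies
```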

The driving force behind all the R&D is the ever-expanding need for capacity, especially on mobile devices that are now becoming equipped with features like 4K video recording, and an undercurrent of distrust of cloud storage services due to hacks or widely reported governmental intrusions.

The 400GB SanDisk Ultra microSDXC UHS-I card was first made public on Thursday morning at the IFA 2017 conference, offering an impressive 144GB more storage than any other microSD card on the planet. It supports a blazing-fast 100MB/s read speed, A1 app performance, and UHS Speed Class 1, so all that extra space doesn’t come at the cost of reduced speed or increased load times.

Some tech magazines have lambasted the new card because it carries a retail price of $249.99, which does make us wonder: how much would those tech reporters have been willing to pay back in the day for more than 300,000 floppy disks’ worth of memory working at more than 1,000x the access speed and taking up almost zero physical space? The price of progress isn’t always low, but it has proven itself time and again to be worth every penny.

25 Aug 2017

The Constant in E = MC² May No Longer Apply to The Speed of Information

by Bill

One of the simplest, most elegant, most profound, and most far-reaching of all human intellectual achievements was Einstein’s formulation of E = MC², describing the fact that energy is equal to mass times the speed of light squared. The equation has led to massive scientific advancements. It also posits that the entire universe is restricted by one constant that never changes: Einstein suggested that the speed of light, 2.99792458 × 10⁸ m/s, serves as a sort of universal speed limit that no particle can ever surpass, and countless experiments in the visible world seemed to back up that claim. However, in the incredibly microscopic world of quantum mechanics, things become much fuzzier. Now, modern breakthroughs in data transmission suggest information may not need to obey Einstein’s formulations.

Using a principle known as quantum entanglement, scientists have discovered that two particles can be “paired” so that each reacts to changes affecting the other instantaneously, even across huge distances. In a paper published in Science on June 16, the Chinese team reported that it had achieved its goal, measuring more than 1,000 pairs of photons to verify that they were indeed entangled as predicted, and the team has gone on to use the entangled particles in a revolutionary new approach to satellite communications.

The Quantum Science Satellite, nicknamed Micius or Mozi (Chinese: 墨子) was designed to establish a ‘Hack-Proof’ communications system of unimaginable speed and precision. Now initial tests are proving it actually works.

The key thing to keep in mind is that with quantum entanglement, data isn’t actually being sent or received anywhere. Instead, the paired particles are altered in one location and instantaneously become altered in exactly the same way in another location. That allows the data to exist in two places at once without the need for anyone to send it anywhere, because it is already there the moment it is created.

Many publications are touting this new technology as a way to move data without any possibility of it being snooped or hacked by a third party, since there is no actual transmission. That is impressive, but far too few are taking note of the fact that this also means data could be “sent” anywhere without speed even being a factor. As one example, if we sent a human crew on a mission to Mars, any traditional communication they sent to Earth would take at least 12.5 minutes to arrive when Mars is 12.5 light-minutes away from Earth. Until now, that delay was not something that could be overcome, and planets farther away or in other solar systems would impose years of delay in communications. But if we sent that same crew with entangled particles, they could theoretically speak to people here on Earth with zero latency.
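The Mars delay follows directly from the speed of light. The 12.5 light-minute figure is one representative distance; the actual Earth-Mars separation varies over the planets' orbits from roughly 3 to 22 light-minutes:

```python
# One-way light delay to Mars, from a distance expressed in light-minutes.
# 12.5 light-minutes is one representative Earth-Mars distance; the real
# separation varies across the orbits.
c_km_s = 299_792.458        # speed of light in km/s
delay_s = 12.5 * 60         # 12.5 light-minutes, in seconds
distance_km = c_km_s * delay_s

print(f"distance: {distance_km / 1e6:.0f} million km, "
      f"delay: {delay_s / 60:.1f} min each way")
# roughly 225 million km, 12.5 minutes each way
```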

Now apply that same logic to modern computing and it becomes easy to see why everyone from High Frequency Traders on Wall Street to video game creators and ad networks are excited by the prospect of immediate information without the need to send data to or from anywhere ever again!

23 Aug 2017

FCC Transparency Called Into Question

by Bill

Ajit Pai, the Chairman of the Federal Communications Commission, said in February that he wanted the agency to be “as open and accessible as possible to the American people.” Now people are calling that statement into question due to some key information that has become surprisingly hard to acquire.

The FCC’s recent handling of complaints from the public about internet providers and the still murky cause(s) of a May 7th outage of the public comments section of the FCC’s own website are garnering interest from politicians and the public.

“Chairman Pai promised to make the FCC more transparent, but the early returns aren’t looking good,” says U.S. Senator Ron Wyden (D-Oregon), in a statement. “The FCC seems more concerned with helping Big Cable than living up to his promise.”

Pai declined to be interviewed by Wired about the issues, but a spokesperson told the magazine that the chairman “is proud of the transparency measures he has instituted at the FCC.”

Still, complaints persist about a lack of transparency at the FCC regarding the commission’s stated plan to reverse some of its own net-neutrality rules, which have prohibited internet providers from favoring some forms of traffic over others.

The FCC has stated that it received only one formal complaint about the shift in policy, but fails to mention that the agency has received more than 47,000 informal complaints about net-neutrality violations since the rules took effect in 2015. That’s significant because a formal complaint costs $225 to file and often requires lawyers, procedural rules, and written pleadings, while informal complaints can be filed for free with a simple online form. An accurate accounting of actual complaints remains elusive; on July 26th, American Oversight sued to obtain the records, but again the FCC declined to comment on the suit.

A May 7th outage of the commission’s public comment system followed a segment of the television show Last Week Tonight with John Oliver, in which the host asked viewers to file comments about net neutrality. The next day, the FCC blamed the outage on a cyber attack: “Our analysis reveals that the FCC was subject to multiple distributed denial-of-service attacks,” FCC chief information officer David Bray said in a statement published on May 8th. “These were deliberate attempts by external actors to bombard the FCC’s comment system with a high amount of traffic.”

However, journalist Kevin Collier filed suit against the agency after it did not respond to his April 26 FOIA request. The FCC told tech news site Gizmodo it had no records predating Bray’s statement related to the “analysis” he referenced.

It becomes increasingly difficult for an agency like the FCC to claim net neutrality is not in the public interest when the public keeps saying it is, and the FCC has decided to stop accurately reporting what the public has been saying all along.

We will continue to monitor this story and the political chatter regarding net neutrality, as it directly affects our clients and the Internet in a broader sense. When regulations change, business strategies must adapt, and NationalNet is always about putting our clients first, with as much foresight as we can muster to assist in the delicate process of staying ready for whatever comes down the pipe, transparently announced or otherwise.

NationalNet, Inc., Internet - Web Hosting, Marietta, GA