SSAE 16 Type II Certified
03 Apr 2014

The End of Windows XP Whether Professionals Want It Or Not

by Bill

With all support for Microsoft’s 13-year-old operating system, Windows XP, finally winding down and the final update announced, the news is filled with reports that 95% of the world’s ATMs are running this soon-to-be-antiquated, and soon no longer security-patched, operating system. With banks scrambling to upgrade their ATMs’ operating systems ahead of the deadline, it’s also being reported that 28% of web users are still running XP.

With the final security patch scheduled for April 8th, machines running XP will likely be hit with wave after wave of cyber attacks the morning of the 9th, and no matter how bad the consequences might be, there will be no fixes proffered by Microsoft. The tech giant has previously warned XP users that the end of support will be the equivalent of a starter’s pistol for hackers, particularly as they can scour subsequent security patches issued for Windows 7 and 8 for exploits that will gain them access to systems still running XP. As an illustration, Microsoft revealed that XP shared 30 security holes with Windows 7 and Windows 8 that were patched between July 2012 and July 2013, which would have given hackers the ability to reverse-engineer XP vulnerabilities.
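For administrators auditing a fleet ahead of a deadline like this, the check itself is simple. The sketch below is a hypothetical helper, not any real tool: only the Windows XP end-of-support date (April 8, 2014) comes from the article, and the function and table names are made up for illustration.

```python
from datetime import date

# End-of-support dates keyed by Windows release name. Only the XP date is
# from the article; extend the table with other releases as needed.
END_OF_SUPPORT = {
    "XP": date(2014, 4, 8),
}

def is_unsupported(release: str, today: date) -> bool:
    """Return True if the given Windows release is past its end-of-support date."""
    eol = END_OF_SUPPORT.get(release)
    if eol is None:
        return False  # unknown release: assume still supported
    return today > eol

# The morning of April 9th, 2014, XP machines are out of support:
print(is_unsupported("XP", date(2014, 4, 9)))   # True
print(is_unsupported("XP", date(2014, 4, 8)))   # False
```

In practice such a check would be fed by an inventory system reporting each machine’s OS release, with any flagged hosts scheduled for migration.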

The specter of this looming deadline is frightening enough that the British government has announced it will pay Microsoft £5.548 million (US $9.2 million) for an additional 12 months of “critical” and “important” security updates for Windows XP, Office 2003 and Exchange 2003, covering all of the UK government agencies still soldiering on with the antiquated operating system. As a condition of participating in the program, these agencies, an estimated 85% of which are still running XP, must institute plans to migrate to a current operating system.

While it’s tempting to simply heap the blame on Microsoft for subsequent Windows releases that gave customers little compelling reason to upgrade, particularly as Vista, Windows 7 and Windows 8 have been relatively poorly received, this dropping of support has been a long time coming, even with the various stays of execution issued over the past 5 years. That so many so-called “professional users,” major corporations and governments have been caught unprepared, to an even greater extent than the general public, is surprising.

24 Mar 2014

US Senator Franken: Comcast-Time Warner Merger Threatens the Nature of the Internet

by Bill

Sabre-rattling over major mergers, net neutrality and the way bandwidth will be allocated continues to capture headlines. Now the US Senate is stepping into the fray and imploring regulators to be cautious with Comcast. US Senator Al Franken (D-MN) has raised what he describes as serious concerns about the proposed acquisition of Time Warner Cable by Comcast and its effects on consumers. The deal, he said, “could jeopardize the open nature of the Internet by tilting the balance of power from people to huge corporations.”

In a letter addressed to the Department of Justice’s Antitrust Division, Franken stated: “The Internet is an open marketplace where everyone can participate on equal footing, regardless of one’s wealth or influence – and I believe that’s the way it should be. The Internet has been a platform for innovation and economic growth since its inception. It also has connected Americans in unprecedented ways, facilitating the free exchange of information and ideas. Simply put, the Internet belongs to the people, not to huge corporations. Comcast’s proposed acquisition of Time Warner Cable could disrupt this balance of power, resulting in higher costs and fewer choices for consumers. Without open Internet protections, Comcast, Time Warner Cable, and other broadband service providers could block, degrade, or charge extra fees to transmit Internet traffic.”

Franken also wants to make peering an issue in the merger review. While big network operators have traditionally engaged in “settlement-free peering,” exchanging traffic without payment, consumer ISPs like Comcast have recently begun demanding, and getting, payments from Netflix and its traffic providers. Comcast is currently in a battle with Cogent, one of the companies Netflix pays to distribute its traffic across the Internet.

Sen. Franken noted that prior to the NBC Universal acquisition, Comcast was sanctioned by the FCC for degrading traffic. That FCC ruling dates from 2008 and found that Comcast had “secretly degraded” peer-to-peer traffic on their network. As a condition on the approval of their acquisition of NBC Universal, Comcast is not allowed to block or discriminate against Web traffic, but Franken noted that “Comcast’s net neutrality obligations expire in January 2018, which raises the question of what happens after that time.”

The Comcast Time Warner Cable merger will be reviewed by both the FCC and DOJ, and in response to Franken’s comments, Comcast issued the following statement: “The Comcast Time Warner Cable transaction will bring millions more Americans under the Open Internet rules as soon as our deal closes. We fully expect that the FCC will have in place Open Internet rules that will apply to all companies by the time our current condition from the NBCUniversal deal expires in 2018. That condition was always meant as a bridge to enforceable rules that would be applicable to all companies in the industry. Comcast has supported the Open Internet rules since they were first proposed and is the only company that is currently required to abide by them.”

What makes these moves interesting is that they all appear to be aimed at slicing and dicing the pie among a very small group of top-end players, with consumers mentioned only insofar as data usage affects these monolithic deals. How these landmark moves will impact individual site owners and their visitors remains to be seen, and NationalNet will continue to track each development for our customers, finding as many innovative ways to create a competitive advantage as we can.

18 Mar 2014

US Commerce Department To Relinquish Remaining Control Over The Internet

by Bill

US officials announced plans last week to relinquish the final vestiges of US authority over domain names and web addresses, currently managed under a contract between the US Commerce Department and the Internet Corporation for Assigned Names and Numbers (ICANN) that is set to expire in September 2015.

The transition from US federal government authority to a new, yet-to-be-determined oversight system will be among the topics at an international meeting on the future of the Internet scheduled to start on March 23 in Singapore. The Commerce Department’s National Telecommunications and Information Administration has declared that the new governance model must ensure that ICANN is free from government influence. The plan must also fulfill several other conditions, such as preserving the security and stability of the Internet while keeping it open and free from censorship.

Business groups and some others have long complained that ICANN’s decision-making is dominated by the interests of the industry that sells domain names and whose fees provide the vast majority of ICANN’s revenue. Concern about ICANN’s stewardship has spiked in recent years amid a massive and controversial expansion that is adding hundreds of new domains, such as .book, .gay and, most controversially, .sucks, to the Internet’s infrastructure. More than 1,000 new domains are slated to be made available, pumping far more fee revenue into ICANN.

Senator Jay Rockefeller, chairman of the Senate Commerce Committee, in a letter to ICANN blasted the “.sucks” domain as “little more than a predatory shakedown scheme,” designed to “force large corporations, small businesses, non-profits, and even individuals, to pay ongoing fees to prevent seeing the phrase ‘sucks’ appended to their names on the Internet.” One of the registry companies seeking to manage .sucks registrations has indicated that it plans to charge as much as $25,000 for brand registrations, which certainly lends credence to the “shakedown” accusations levied against the proposed new domain.

As the world seems to be shifting away from US regulation and ICANN continues to add new TLDs to the open market, the future of domaining appears as unsettled now as it was when the first gold rush of .com domains was opened to the public. Questions about who will own, regulate or market each new domain product remain largely unanswered, and with branding being such an important consideration online, the answers will matter to businesses of every size.

10 Mar 2014

The Inherent Hypocrisy of Modern Tech Patent Protections

by Bill

The 21st century has witnessed the age of the patent troll come to fruition, with several corporate patent-hoarding entities and unlikely consortiums created for the express purpose of taking tech companies to court, sometimes on the slimmest allegations of infringement. The result is, in essence, an innovation tax, with litigation used as a cudgel to extract payment from the companies targeted.

Twitter, meanwhile, has charted a different course and wants to reform the patent system with its own Innovator’s Patent Agreement (IPA), under which it insists its patents will be used only for “defensive purposes” and not in “offensive litigation” without the permission of all the inventors listed on the patent – the hope being that the required unanimity will stall any lawsuits except in the most egregious cases of infringement. While it remains to be seen whether this is a viable policy for patent defense, Twitter still inhabits the real world, and recently paid IBM $36 million for a portfolio of 900 patents to avoid becoming embroiled in a lawsuit with Big Blue.

Whether Twitter’s IPA is the answer to the problem of patent trolling or just the latest doomed attempt at thwarting the practice, it is widely agreed that the monetary and intellectual costs of defending lawsuits and the threat of litigation are a drag on innovation and corporate resources, and that something needs to be done sooner rather than later. The irony, of course, is that the corporations who decry patent trolling the loudest are often engaging in it themselves: with companies such as Google, Microsoft, Samsung and Apple all fighting it out in high-profile patent disputes, their positions on the fairness of patent litigation seem to shift from case to case depending on whether they are the defendant or the plaintiff at that particular moment.

04 Mar 2014

Netflix And Carriers Start Dealing While Advocates Lament The Outcome

by Bill

In an agreement announced last week, Netflix will be paying Internet Service Provider (ISP) Comcast an undisclosed amount to “relieve internet congestion,” which is promised to improve Netflix’s streaming performance for Comcast’s customers. The Netflix/Comcast deal comes hot on the heels of Comcast’s announcement that it plans to acquire Time Warner Cable for $45 billion, which, if approved by federal regulators, will result in an ISP behemoth dwarfing its nearest competitors and providing even more leverage for Comcast when negotiating with content providers like Netflix for adequate throughput to their shared customers.

The Comcast/Netflix deal is being called by many the death knell of net neutrality, even as the FCC scrambles to devise a strategy to ensure content providers have unfettered access to customers in the wake of a recent court ruling on the issue. The deal creates facts on the ground and a precedent: a major content provider “willingly” paying for its bandwidth usage on its customers’ ISP.

Netflix’s streaming service has grown by leaps and bounds in recent years, reportedly constituting over 30% of all internet traffic during peak viewing hours, and Comcast and other ISPs have long been seeking to monetize that traffic. Under what has become the traditional internet business model, backbone providers receive money from subscribers, whether consumer ISPs or businesses, while data exchanges between backbone providers are handled via “settlement-free peering,” effectively washing out backbone transmission costs. Under the new model, ISPs who are already being paid by their subscribers on one side have realized a whole new source of revenue on the other.

Netflix entering this deal says a lot about the likely future of net neutrality, and while Netflix benefits from the agreement, the effects will be most pernicious for content providers without pockets as deep as the streaming giant’s. It’s not difficult to imagine Hulu, YouTube and other big players inking similar deals, and as industry heavyweights they can doubtless afford to pay to play. The unanswered question is what the future holds for the next internet breakthrough service, which will face an uneven playing field against the structural advantage the big players will now have baked into the mix.

It also remains to be seen how far down the hierarchy of content providers the ISPs are willing to go seeking additional revenue. It’s not far-fetched to imagine a tiered pricing system imposed to extract every last possible nickel from those who contribute content to the internet, from giants like Netflix down to less bandwidth-intensive but popular sites that would see slow throughput as a competitive disadvantage, given that today’s internet users are notoriously impatient about page loading times. With so much money at stake and loading times mattering more than ever, a reliable host optimizing your bandwidth is an increasingly important part of any successful business plan online.

19 Feb 2014

Moore's Law Alive And Well But No Longer Thanks To Intel?

by Bill

“Moore’s Law,” the famous prognostication coined in 1965 by Intel co-founder Gordon Moore, predicted that computing power would double every 18-24 months. The Law has held more or less true for decades, but has recently been widely reported to be reaching the end of the line, due to the simple fact that transistors have gotten smaller and smaller and become too densely packed.

Many experts have gone on record saying that engineers are reaching the physical limits of power consumption and cooling. Bob Colwell, Intel’s former chief architect, has stated that by 2020 Moore’s Law will cease to be a viable paradigm, boldly predicting that its passing will have significant repercussions for the continuing development of the information economy.

That would be 50 years in a row of computing power doubling every two years more or less like clockwork suddenly screeching to a halt. Consumers accustomed to ever-increasing sophistication of computers, software and mobile devices would be left for the first time wondering what to do now that computers have reached a perceived physical limit.
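The scale of that 50-year run is worth spelling out: doubling every two years for 50 years is 25 doublings, a cumulative factor of 2^25, roughly 33.5 million. A few lines of Python make the arithmetic concrete:

```python
years = 50
doubling_period = 2            # years per doubling, per Moore's Law
doublings = years // doubling_period
growth_factor = 2 ** doublings

print(doublings)       # 25
print(growth_factor)   # 33554432 -- roughly a 33.5-million-fold increase
```

That is the compounding consumers have quietly come to expect, and what makes the prospect of it "screeching to a halt" so jarring.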

Enter the “NanoFSM,” developed by a team from Harvard University and the non-profit military contractor The MITRE Corporation.

NanoFSM is an ultra-small, ultra-low-power processor, termed a nanoelectronic finite-state machine, that is smaller than a single human nerve cell. It is composed of hundreds of nanowire transistors, each a switch about ten thousand times thinner than a human hair. The nanowire transistors use very little power because they are “nonvolatile”: the switches remember whether they are on or off even when no power is supplied to them, which has significant advantages in terms of heat dissipation. The team behind it believes it to be “the densest nanoelectronic system ever built,” and it looks like it might allow Moore’s Law to continue onward, just no longer under Intel’s stewardship.

Others working to stave off the end of Moore’s Law, at the University of California at Berkeley, are experimenting with nanoribbon graphene, a sheet of carbon just one atom thick, which researchers promise could lead to transistor densities on a computer chip as much as 10,000 times higher than what is achievable today, aided in large part by the material’s exceptional conductivity.

Making predictions is a dangerous business, but just as the technology we take for granted today would seem like magic or science fiction a few short years ago, there are still brilliant minds working on maintaining the pace of innovation, and pushing it to new heights unimaginable by consumers today. The real lesson here is that any time someone tells you something is not possible, in Hosting, Computing, Cloud Software, Business, or Life… remind them, it’s only not possible… yet!

14 Feb 2014

Facing Inaction on the Part of Entrenched Broadband Interests Mozilla Acts

by Bill

Insiders have been seeing negative reports about the state of the US internet surfacing nearly every day, from declining competitiveness compared to our overseas rivals to the collapse of Net Neutrality, but for a change of pace, there’s some good news to report.

The Mozilla foundation, the non-profit behind the wildly popular open source web browser Firefox, in an effort to push improvements in internet speed, has launched the Mozilla Gigabit Community Fund in conjunction with the National Science Foundation and US Ignite to provide grants to software developers in Chattanooga, Tennessee and Kansas City, Kansas, cities that are currently served with gigabit fiber services.

While the $300,000 is being provided by the National Science Foundation, the funding will be disbursed by Mozilla as 10 grants to each city, in amounts ranging from $5,000 to $30,000, for local software and application developers to come up with “killer apps” that will make use of the truly voluminous super-broadband, up to a billion bits per second, deployed in those cities.

The deployment of gigabit service, which provides over 50 times the internet speed of today’s standard cable and DSL services, has seen the entrenched players dragging their feet, preferring to harvest profits from their existing infrastructure rather than investing in the new tech that will replace it. Google Fiber, the search giant’s fiber service, has already rolled out in Kansas City, Kansas and Provo, Utah, with Austin, Texas and Salt Lake City announced as the next cities on Google Fiber’s agenda.

Verizon for its part rolled out a fiber internet service, Fios, but has since ceased expanding its coverage, and there are some who say that there really isn’t enough demand for the truly blazing speeds that gigabit fiber service provides, a situation the Mozilla Fund is aiming to change. The fund will award money to projects that “demonstrate how emerging gigabit technologies are relevant in people’s everyday lives.” But in keeping with the Mozilla Foundation’s Manifesto, it wants to fund applications that are “rooted in the local community, and that are pragmatic, deployable in the near term, have measurable impact, and are re-usable and shareable with others.”

One sure way to answer the carriers’ complaint that high-speed broadband isn’t needed because there is a lack of applications that use it is to do exactly what Mozilla is doing: provide incentives to break the logjam. Now when someone asks what needs to come first, connectivity or the applications that use it, developers can give the correct answer – and Mozilla deserves credit for going first instead of waiting for either or both.

06 Feb 2014

Net Neutrality May Find Resilience In Consumer Advocacy

by Bill

Much has been made of the recent court decision striking down the FCC regulations commonly referred to as ‘Net Neutrality’, which until recently required carriers to provide bandwidth in a manner agnostic to the kind of content, its source, and any other ancillary factors. In the wake of the court decision, many industry participants, including NationalNet, posted an array of viewpoints about the impact the decision may have on eCommerce and the way the internet is priced for consumers.

One thing that seems to have been overlooked is the growing ability of consumers to advocate for themselves, band together quickly, and collect support effectively. Now there are several groups sprouting up with websites, online communities and tools designed to hold carriers to task as a matter of free market economics rather than governmental regulation.

As one example, the site http://netneutralitytest.com has posted a simple ‘throttle checker’ that allows anyone to test their actual bandwidth speed across a variety of different cloud services, including AWS East, AWS California, AWS Oregon, Linode (Newark, NJ), Linode (Atlanta, GA), Linode (Dallas, TX) and Linode (Fremont, CA), with more to be added in the near future. In seconds, consumers can see for themselves whether their carrier has decided to throttle the internet speed they are being provided for some forms of internet usage compared with others.
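The idea behind a throttle checker of this kind can be sketched in a few lines: time a download from each endpoint, convert it to throughput, and flag any endpoint dramatically slower than the best one. The sketch below is purely illustrative, not how netneutralitytest.com actually works; the endpoint names, byte counts and the 0.5 cutoff are all made up.

```python
def throughput_mbps(bytes_received: int, seconds: float) -> float:
    """Convert a timed download into megabits per second."""
    return (bytes_received * 8) / (seconds * 1_000_000)

def flag_throttling(measurements: dict[str, float], ratio: float = 0.5) -> list[str]:
    """Return endpoints whose throughput falls below `ratio` times the best observed.

    `measurements` maps endpoint name -> Mbps. The 0.5 cutoff is an arbitrary
    illustrative threshold, not anything the real site documents.
    """
    best = max(measurements.values())
    return [name for name, mbps in measurements.items() if mbps < best * ratio]

# Hypothetical results from timed downloads against different hosts:
results = {
    "AWS East": throughput_mbps(25_000_000, 10.0),       # 20.0 Mbps
    "Linode Newark": throughput_mbps(24_000_000, 10.0),  # 19.2 Mbps
    "Video CDN": throughput_mbps(5_000_000, 10.0),       # 4.0 Mbps -- suspicious
}
print(flag_throttling(results))  # ['Video CDN']
```

If general-purpose endpoints all measure fast while a video host crawls, the slowdown is unlikely to be the consumer’s last-mile connection, which is exactly the inference such tools let users make.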

The purpose is simple and the impact may be profound. In the lead-up to the court decision that ended Net Neutrality, many think-tanks presumed that consumers would be unable to tell if they were being affected, or would mistakenly blame entities like Netflix for degraded video quality without understanding the big picture of what happens to their packets as carriers seeking to add new service tolls along the information highway pipe them from Netflix’s servers to their devices. That presumption is being quickly disproved, and may be proof of something even more fundamentally important: that the internet has come of age, and that society as a whole now has at least a functional understanding of how it works.

If consumers take enough of an interest in which companies do or do not throttle access, and vote with their choice of provider, it may well be the invisible hand of the economy that spanks the wrist of carriers seeking greater revenue, rather than any governmental agency or Congressional action. After all, an informed and actively engaged populace is likely the fastest way to put greater pricing competition, and marketing of ‘unlimited neutral access’ and the like, on the horizon from companies like Comcast and Verizon.

30 Jan 2014

Microsoft Makes A Bold Move With In House Servers And The Open Compute Project

by Bill

Microsoft manages quite a bit of data for its Bing, Windows Azure and Office 365 services, and has announced that it has quietly been utilizing its own server designs, bypassing the products of significant strategic partners like Hewlett-Packard (HP) and Dell. That Microsoft would not want to publicize bypassing its traditional allies is completely understandable, which makes its announcement at the Open Compute Summit in San Jose, California even more surprising. Not only did Microsoft share its previously secret in-house server designs, it also announced that it will “open source” those designs and the accompanying software, sharing them so that other online entities can use them inside their own data centers as well.

Launched by Facebook in 2011, the Open Compute Project was the result of Facebook’s rapid expansion and the high cost of using off-the-shelf servers to meet its immense data-handling needs. While other large, established players like Apple, Google and Amazon had built data centers around the globe using their own lower-cost designs, they each took a proprietary approach as a way to protect a competitive edge. Facebook chose the open source route, and in doing so was able to, as Mark Zuckerberg puts it, “blow past what anyone else has done.”

According to Zuckerberg, the utilization of Open Compute Project equipment instead of proprietary products from established server manufacturers has saved the social networking giant $1.2 billion, and with the higher energy efficiency of the open source hardware, Facebook was able to conserve the equivalent annual energy usage of 40,000 homes.

Microsoft has been careful to portray Dell and HP in a positive light, and while Google and Facebook have their equipment made by low-cost Asian manufacturers, Microsoft has thus far declined to reveal who has been building its machines. Further, it was announced that Dell and HP will be selling systems based on this open source design, maintaining the ongoing, mutually beneficial relationship these tech giants have long enjoyed.

After years of trying to maintain control over the whole world of computing, and fighting the notion of open source wherever it could, to the point of being saddled with monumental anti-trust litigation, Microsoft seems to be changing its ways dramatically as Steve Ballmer exits and new decision makers come into place. By sharing its designs and software, Microsoft may push the web forward, helping others build more efficient data centers, while at the same time lowering the cost of producing its custom-built gear by increasing its ubiquity.

The open source movement can also benefit Microsoft by helping it sell more software: since the software that underpins Microsoft’s cloud services like Azure is designed to run on these servers, why wouldn’t developers use Microsoft’s software when implementing their own services?

In addition to Microsoft joining the alliance, a similar announcement was made by IBM, bringing the corporate membership of the organization to 150, including tech heavyweights like Advanced Micro Devices and Seagate Technology. NationalNet will continue to utilize the best and most efficient servers available at the leading edge of the market to provide our clients with the fastest, most efficient and affordable throughput – whether that evolves into open source servers or not, only time will tell.

27 Jan 2014

FCC Net Neutrality Rules Derailed By Federal Court of Appeals

by Bill

Since the earliest days of internet connectivity, ISPs and broadband providers have provided transfer speeds in a completely neutral manner. Whether a site is owned by a client or a competitor, the FCC rules on ‘Net Neutrality’ have preserved this overarching method of content distribution, even in the face of very powerful market forces. Carriers have long chafed at the idea that they could not charge premium fees for faster service, or throttle distribution of services that they argue profit from the carriers’ inability to price their bandwidth pipes effectively.

One notable example is Netflix, a site that now accounts for a significant percentage of all internet traffic, which has benefited tremendously from the fact that it can provide clear video streaming services to customers without paying any kind of additional fee for the massive amount of infrastructure it utilizes each day. Other services owned by Google like YouTube, Gmail and Hangouts also rely heavily on their ability to serve content instantaneously to clients across point to point infrastructure that rarely belongs to them. Shouldn’t Comcast, Verizon and others be able to monetize their products to the full extent the market will bear, without government intervention preventing their full profitability, they argued – and the court agreed.

In a decision this month, the United States Court of Appeals for the District of Columbia Circuit ended net neutrality when it decided case No. 11-1355 in favor of Verizon and against the Federal Communications Commission (FCC) after Verizon appealed a lower court ruling that had gone in the FCC’s favor. While some may argue that the case is headed for the Supreme Court in some form or another, or that other methods exist for the FCC to create the same kind of regulations in a more legally justifiable way – rumors persist that the FCC does not intend to do so and that the Supreme Court is likely to side with the carriers as well.

Some pundits are already panicking and calling the ruling ‘the end of the internet as we know it’, while others are taking a more serious business-minded approach. One thing to keep in mind is that the carriers do not derive revenue by putting site owners out of business, and the FTC is unlikely to allow predatory pricing practices to get out of hand. That makes the notion that carriers are about to choke-off your ability to distribute content extremely unlikely. What may be much more likely is the birth of a tiered traffic system that follows the freemium pricing model now being used by many other sectors of the market.

Imagine paying Netflix $9.00 per month for basic service and watching standard-definition video, or having access to three videos per day, with a $15.00 per month premium package allowing you to watch as much as you like, all in HD. You may not need to imagine it for long, because it is one of the most likely real world outcomes to be spurred by this ruling.

From Netflix’s perspective, this will be a major marketing obstacle. The premium pricing will likely be seen by many as a money grab (the same sort of perception that almost sank the brand entirely the last time it changed pricing models). In reality, however, most if not all of that premium fee will be forwarded along to the carriers to pay the increased costs now allowed without net neutrality in place.

From the carrier perspective, it represents an enormous amount of additional revenue with little or zero additional expense. And their spin-agents will be hard at work trying to convince the world that this is not a case of them getting extra money now, it’s a case of them having not gotten their fair due over the past two decades until now. A very hard pitch to make successfully, whether you find truth in it or not.

NationalNet, Inc., Internet - Web Hosting, Marietta, GA