
Monthly Archives: July 2013

22 Jul 2013

The End of “Unlimited” Cable Bandwidth May Be Fast Approaching

by Administrator

Cardozo Law School professor Susan Crawford is also an adjunct professor at the School of International and Public Affairs at Columbia University, a Fellow at the Roosevelt Institute, a former ICANN board member, and a former Special Assistant to the President for Science, Technology, and Innovation Policy. In her recent book “Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age”, Crawford took a deep look at the forces shaping the flow of bandwidth from carriers to consumers.

In an insightful op-ed penned for Wired, Crawford explores the future of consumer bandwidth as more consumers cut the cord and obtain their programming via à la carte services such as Hulu and Netflix. The day is fast approaching when cable television will be delivered in the form of IP packets just like any other data, without billing packages artificially constructed by cable operators. With all of this in mind, Crawford contends that the cable industry is working to eliminate “unlimited” bandwidth plans in favor of selling users bandwidth based on consumption.

In many areas, the local cable company has a virtual monopoly on bandwidth, particularly when it comes to providing the massive throughput that future content will require once publishers make 4K the new industry-standard resolution.

Pushing for tiered pricing now, while it is still relatively easy to accustom users to the notion of paying per bit, would turn that kind of system into an unprecedented cash cow once future bandwidth requirements soar. Crawford foreshadows the arguments of providers like Liberty Media chairman John Malone, who is likely to justify such pricing changes as a way of ‘limiting congestion’ when, according to Crawford, it is really about little more than maximizing profits. Most cable companies made their network investments quite a while ago and are now reaping massive rewards. In fact, according to Crawford, Time Warner Cable and Comcast have been bringing in revenue of more than seven times their infrastructure investment for some time now.

Netflix has become a means by which many have reduced or eliminated their cable TV subscriptions, but a usage-based bandwidth business model might change that. Malone has already said it makes sense “So that, you know, Reed (Hastings, CEO of Netflix) has to bear in his economic model some of the capacity that he’s burning … And essentially the retail marketplace will have to reflect the true costs of these various delivery mechanisms and investments.” Translated into plain English, he is saying that anyone who wants to transmit data over his network is going to have to pay him for the privilege. Unless the consumer cost of bandwidth is tied in some way to the expense of providing it, the new, more open arrangement in which consumers pick the specific channels they want may quickly be priced much the same way carriers have priced their packages for decades.

Any shift by bandwidth providers to a per-bit business model would have a profound impact on website owners as well. Proper optimization and hosting efficiency would take on an even bigger role in generating and maintaining traffic, as visitors would become reluctant to spend their time and precious bandwidth on sites that waste it unnecessarily. For that reason, NationalNet continues to seek out every available method of delivering data in the most efficient manner possible, utilizing state-of-the-art servers and hosting protocols carefully designed to provide the best user experience in the fastest and most economical manner possible for publishers and consumers alike.


Read more: http://www.wired.com/opinion/2013/07/big-cables-plan-for-one-infrastructure-to-rule-us-all

17 Jul 2013

5 Reasons to Host with NationalNet

by Administrator

Our Support:
Our support is what we have built our reputation on. Our average response time to tickets is less than 5 minutes, and our average resolution time is under 17 minutes. We feel that waiting for support is not an option. There are two types of tickets: trouble tickets (something is broken) and project tickets (you need something done). If it’s broken, you want it fixed – NOW. If it’s a project ticket, we know you’re probably waiting on us so you can move forward with your project. No one likes waiting for support, and we get that!


Our Network:
Today’s surfer will abandon your site if it’s too slow. We understand that, so we spare no expense on our network. Whether it’s our dark fiber gear or our Cisco networking equipment, our in-house network engineers have assembled all the right gear to provide the optimal network experience. We also select our bandwidth providers based on eyeballs: our network engineers constantly analyze reports and real-time monitoring to make sure the connectivity we utilize meets our traffic needs. Between paid transit and private peering, we use over 40 different providers and ensure our routes take the optimal path.

Peace of Mind:
Our customers sleep well at night knowing that NationalNet “has their back”. Using our proprietary monitoring software, built in-house and constantly improved upon, we monitor every server and every service on that server. When an issue arises, we are alerted and a tech is immediately tasked with resolving it. Our goal is to find and fix issues before the customer is even aware there was one. Our fully managed customers can focus on growing their business and leave the tech work to our talented staff.
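
The software itself is proprietary, but the core idea can be sketched in a few lines of Python. This is purely illustrative – the hostnames and check logic below are placeholders, not our production system:

    import socket, time

    # Hypothetical sketch of service monitoring: poll every host/port pair and
    # raise an alert the moment a check fails. The real system opens a ticket
    # and tasks a tech automatically; here we simply print the alert.
    SERVICES = [("web01.example.com", 80), ("db01.example.com", 3306)]

    def is_up(host, port, timeout=3):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def monitor(interval=60):
        while True:
            for host, port in SERVICES:
                if not is_up(host, port):
                    print(f"ALERT: {host}:{port} is unreachable")
            time.sleep(interval)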

Our People:
This is an area where we take great pride. We have put together a great team of people. Whether it’s our system administrators, network engineers, facility engineers or billing staff, each person has been brought in with an eye toward a team effort. One thing we like to say is, “We can train skills, but we cannot train personality.” We make sure that every employee is not only capable of doing the task, but also fits in with everyone else here. It’s a great family-oriented mindset.

Our Mission Statement:
It’s very simple: “100% customer satisfaction at any and all cost”. Every employee knows this, and all are fully empowered to do whatever it takes to make the customer happy.

We’re a very old-fashioned bunch here with a very simple philosophy: if you take care of your customers, good things happen. It’s a partnership – if our customers do well, then so do we, and vice versa. We have lived by this for well over 15 years, and judging by the customer testimonials at http://www.nationalnet.com/company/customer-testimonials.html, we must be doing something right!

Find out more about the NationalNet story by visiting our web site at http://nationalnet.com or by contacting us at sales@nationalnet.com.

15 Jul 2013

Networked Hosting Experts Learn New Lessons From Ant Colonies

by Administrator

Ant colonies have a seemingly supernatural ability to survive, and ants are perhaps the most successful species on the planet, populating every continent and major landmass except Antarctica. Each colony acts as a super-organism, with individuals working in synchrony to collect resources, provide security and care for the health of the entire infrastructure, all without any direct central control.

Interestingly enough, a group of researchers at Stanford recently found that the algorithm desert ants use to regulate foraging is very similar to the Transmission Control Protocol (TCP) used to regulate data traffic on the Internet. Both the Internet and the “Anternet” rely on positive feedback: the acknowledgment of data receipt triggers the next data transmission in an electronic network, the same way the return of forager ants from a fertile source triggers the deployment of more ants to retrieve resources for the colony.
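
The shared feedback rule is simple enough to simulate. Here is a toy Python model (our own illustration, not the Stanford team’s code) in which new outgoing foragers are triggered only by successful returns, just as a TCP sender transmits new segments only as acknowledgments arrive:

    import random

    def forage(success_rate, colony_size=100, rounds=60):
        """Ack-driven feedback: successful returns trigger new outgoing trips."""
        active = 10  # initial scouts
        for _ in range(rounds):
            returned_with_food = sum(random.random() < success_rate for _ in range(active))
            # Each successful return triggers roughly two new trips, much as a
            # TCP ACK lets the sender put more segments on the wire; failed
            # trips trigger nothing, so a poor source quiets traffic on its own.
            active = min(colony_size, max(1, 2 * returned_with_food))
        return active

    print(forage(0.9))  # rich source: traffic ramps up toward colony capacity
    print(forage(0.3))  # poor source: traffic collapses back to a lone scout

Neither the colony nor a TCP sender needs a central controller; the rate of outgoing traffic settles to what the environment (or the network) can actually sustain.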

Another example of ants solving human computing problems is their approach to the ‘traveling salesman’ problem discussed previously on the NationalNet Hosting Blog in our post about quantum computing; researchers have dubbed it the ant colony optimization algorithm. While no individual ant has any idea of what is going on at a larger scale, each one keeps track of its own recent experience meeting other ants, either in one-on-one encounters when ants touch antennas or when an ant crosses a chemical trail deposited by another ant. From this seemingly limited communication, extremely sophisticated behaviors emerge in response to challenges in the ants’ environment, from the extraordinarily clever survival strategies of army ants against every conceivable condition or adversary, to the agricultural practices of leaf-cutter ants that “farm” fungus grown on vegetation the colony collects.
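
In software form, ant colony optimization lets many simple agents build candidate routes while depositing “pheromone” on the edges of good ones, so short routes become progressively more attractive. A compact Python sketch of the standard technique (illustrative parameter choices, not any published reference implementation):

    import random

    def aco_tsp(dist, n_ants=20, n_iters=50, evaporation=0.5, q=1.0):
        """Ant colony optimization over a symmetric TSP distance matrix."""
        n = len(dist)
        pheromone = [[1.0] * n for _ in range(n)]
        best_tour, best_len = None, float("inf")
        for _ in range(n_iters):
            tours = []
            for _ in range(n_ants):
                tour = [random.randrange(n)]
                while len(tour) < n:
                    cur = tour[-1]
                    choices = [c for c in range(n) if c not in tour]
                    # Prefer edges with strong pheromone and short distance.
                    weights = [pheromone[cur][c] / dist[cur][c] for c in choices]
                    tour.append(random.choices(choices, weights)[0])
                length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
                tours.append((tour, length))
                if length < best_len:
                    best_tour, best_len = tour, length
            # Evaporate old trails, then reinforce each tour's edges in
            # inverse proportion to its length, mimicking the chemical trails.
            pheromone = [[p * evaporation for p in row] for row in pheromone]
            for tour, length in tours:
                for i in range(n):
                    a, b = tour[i], tour[(i + 1) % n]
                    pheromone[a][b] += q / length
                    pheromone[b][a] += q / length
        return best_tour, best_len

    # Four destinations with symmetric distances; the optimal cycle costs 18.
    d = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    print(aco_tsp(d))

No single “ant” in this loop knows the best route; it emerges from many limited, local interactions, exactly as in the colony.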

Modeling the strategies of ant colonies has led to some important insights on dealing with operating costs and scarcity. Harvester ant colonies, native to deserts, must expend water to obtain water: the ants lose moisture while foraging in the hot sun, and they obtain their water from the seeds they collect. During a 25-year study of harvester ant colonies, scientists conclusively showed that the colonies learned to manage resources more efficiently by laying low during hot periods, and that the best-optimized colonies had the largest number of offspring – evolving beyond the capabilities of colonies that mistakenly scrambled to maximize productivity without forethought or efficiency. Wasteful foraging created water costs too high to overcome, and failure to maximize their return on investment led many colonies to disband entirely.

In the face of scarcity, the networking algorithm that regulates the flow of ants evolves toward minimizing operating costs rather than immediate accumulation. This is a sustainable strategy for any system, be it a desert ant colony or the always-on architecture of the mobile Internet, where avoiding waste has become crucial to optimized performance and reduced overhead expense.

Like human-engineered systems, ant systems must be able to scale up as the colony grows, with the ideal solutions ensuring that the benefit of additional resources outweighs the cost of producing them. Security, reliability, redundancy and minimal overhead are hallmarks of the Anternet, and the same is true of any high-quality managed hosting solution. Large, top-down solutions may seem appealing, but the ant algorithms show us that a simple ground-up approach can achieve a higher degree of sophistication, with a reliability and error-elimination capacity that other systems can’t match. Next time you find a few ants on your kitchen counter, take a moment to marvel at the elegance of the network they operate within. NationalNet strives for that same perfection in the way we provide reliable and cost-effective colocation hosting services for our clients.

Whether the best ideas come from human engineers or billions of ants, our team of highly qualified system architects is ready to provide you and your customers with the best and most reliable managed hosting services, a result of our own ability to keep adapting and evolving our data centers around new ideas and system strategies.

08 Jul 2013

Quantum Computing Becoming A Reality – Leading To Quantum Cloud Hosting

by Administrator

D-Wave, a company that made the controversial claim of having built the world’s first quantum computer, got a big boost recently when researchers at the University of Southern California (USC) announced they had confirmed that the machine is not utilizing “simulated annealing,” the leading theory of how the machine worked according to detractors who insisted it could not possibly be a true quantum computer.

Quantum computing has been something of a holy grail of computing since it was first proposed in 1985 by British theoretical physicist David Deutsch. Unlike a classical computer, which processes data on transistors with “bits” that are either “on” or “off” to yield the series of “1s” and “0s” used to create binary code, a quantum computer operates according to the strange principles of quantum mechanics, which counter-intuitively allow data to exist in two states at once in a “qubit” rather than a bit. In essence, a qubit can be a “0” and a “1” simultaneously under a rule known as the superposition principle of quantum mechanics. So, if one qubit can store a “0” and a “1” simultaneously and you add another qubit, the pair can hold four values at the same time: 00, 01, 10, and 11. As more qubits are added to the string, you end up with a computer that is exponentially more powerful than anything achievable with a conventional computer.
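
The doubling is easy to see by enumeration. A short Python sketch (purely illustrative, and deliberately classical) lists the basis states an n-qubit register holds in superposition; a conventional simulation has to track all 2**n of them explicitly, which is exactly where the exponential gap opens up:

    import itertools

    # Enumerate the basis states of an n-qubit register. A classical machine
    # must represent every one of the 2**n states; a quantum register holds
    # them all at once under superposition.
    def basis_states(n_qubits):
        return ["".join(bits) for bits in itertools.product("01", repeat=n_qubits)]

    print(basis_states(1))  # ['0', '1']
    print(basis_states(2))  # ['00', '01', '10', '11'] -- the four values above
    print(2 ** 20)          # 1048576 simultaneous values for a 20-qubit string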

In theory, quantum computers would excel at combinatorial optimization problems. The classic example is figuring out the most efficient route for a traveling salesman visiting multiple destinations. There is no mathematical shortcut that conventional computers can take to solve combinatorial optimization problems; they are forced to run through the possible combinations and can only determine the best path after all the calculations are complete. The problem with this approach is that the number of combinations rises factorially with the number of destinations: six destinations can be visited in 720 different orders, 8 destinations yield 40,320 orderings, and with 20 destinations the number explodes to roughly 2.4 quintillion.
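
The brute-force approach fits in a few lines of code, which is what makes the explosion so vivid. A minimal Python sketch (our illustration of the general method, not any specific product’s algorithm):

    import itertools, math

    def brute_force_tsp(dist):
        """Score every ordering of destinations and keep the shortest cycle.
        There is no general classical shortcut, hence the factorial blow-up."""
        cities = list(range(1, len(dist)))  # pin city 0 as the start and end
        best = min(
            itertools.permutations(cities),
            key=lambda tour: sum(dist[a][b] for a, b in zip((0,) + tour, tour + (0,))),
        )
        return (0,) + best

    # The factorial growth behind the figures above:
    for n in (6, 8, 20):
        print(n, "destinations:", f"{math.factorial(n):,}", "orderings")
    # 6: 720 | 8: 40,320 | 20: 2,432,902,008,176,640,000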

D-Wave’s quantum computer is designed to handle as many as 512 variables, which results in a number of combinations outside the realm of possibility for any computer bound by the rules of classical physics; the number is reportedly larger than the number of atoms in the universe, though we’re taking the scientists’ word for that – we haven’t counted them all ourselves.

According to its maker, the D-Wave machine contains 512 superconducting circuits, each a tiny loop of flowing current. These are cooled to almost absolute zero so they enter a quantum state in which the current flows both clockwise and counterclockwise at the same time. When you feed the machine a task, it uses a set of algorithms to map the calculation across these qubits and then executes it. What emerges after the temperature is raised is the solution. If that sounds a bit like magic, it’s due to the very strange properties at play in quantum mechanics, and it has immense real-world applications.

Lockheed Martin gave D-Wave a sample problem to solve: a bug in its F-16 software that had taken a team of its best engineers months to track down on classical computers. The D-Wave machine allegedly located it much more quickly – so quickly, in fact, that Lockheed Martin bought a D-Wave machine at a reported price of 15 million dollars.

Until recently, outsiders presumed the D-Wave machine was using a traditional computational process known as “simulated annealing,” which in layman’s terms explores candidate solutions at random, at first accepting occasional steps backward so the search can escape dead ends, then settling on the best answers found as a simulated “temperature” cools. A peer-reviewed article titled “Experimental Signature of Programmable Quantum Annealing” has now been published in the well-respected academic journal Nature Communications, and it seems to completely disprove that assumption. While stopping short of proclaiming the D-Wave a true quantum computer, the paper does show the machine is not using simulated annealing, moving it one step closer to full recognition.
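
For contrast with whatever the D-Wave hardware is doing, classical simulated annealing fits in a dozen lines. A generic Python sketch of the textbook technique the researchers ruled out (the toy objective at the end is our own example):

    import math, random

    def simulated_annealing(cost, neighbor, state, t0=10.0, cooling=0.995, steps=5000):
        """Accept any improvement; accept a worse state with probability
        exp(-delta/T) so early exploration can escape local minima."""
        t = t0
        best = state
        for _ in range(steps):
            candidate = neighbor(state)
            delta = cost(candidate) - cost(state)
            if delta < 0 or random.random() < math.exp(-delta / t):
                state = candidate
                if cost(state) < cost(best):
                    best = state
            t *= cooling  # gradual cooling schedule
        return best

    # Toy usage: minimize (x - 3)^2 with small random steps.
    result = simulated_annealing(
        cost=lambda x: (x - 3) ** 2,
        neighbor=lambda x: x + random.uniform(-1, 1),
        state=0.0,
    )
    print(round(result, 2))  # approximately 3.0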

The applications for this technology are nearly limitless. In addition to the aforementioned route planning, they include image recognition, genome sequence analysis, protein folding, scheduling, and the sort of risk analysis that high-frequency stock traders rely on. So far, defense contractor Lockheed Martin, in partnership with USC, and Google, in partnership with NASA, have bought D-Wave machines.

D-Wave is also planning to launch a “Quantum Cloud” in addition to selling its computers outright. With this service, massive amounts of data will flow to and from remote computers, providing near-instant solutions without the customer having to purchase or maintain the complicated and expensive super-cooled machines.

On the brink of a whole new kind of digital revolution, NationalNet will continue to keep pace with each technological advance, and we stand ready for the onslaught of data that will necessarily accompany each advancement in fully managed hosting, providing our clients and their customers with the efficient bandwidth and data transfer speeds that have made NationalNet a leader in modern hosting solutions and a forward-thinking colocation partner for all of your hosting service needs.

03 Jul 2013

NationalNet Enjoys Company 4th of July Picnic

by Administrator

NationalNet treated its employees to a company picnic lunch, held in the break room because it was raining. All employees brought their favorite dish, and we had a barbecue company do the ribs and chicken on their grill for us. Awesome food and fun was had by all today!


01 Jul 2013

Google Chief Engineer Ray Kurzweil Shows Future Of Hosting And Data Security

by Administrator

Ray Kurzweil is a renowned futurist who, at age 64, was brought into Google late last year to preside over a loosely defined set of projects involving the storage and collection of data with the purpose of replicating the human psyche. Over a long and prestigious career that has included winning the 1999 National Medal of Technology and Innovation, being credited with creating the first text-to-speech synthesizer in the 1970s, and authoring the best-selling book “The Singularity Is Near” in 2005, Mr. Kurzweil has raised some very interesting challenges on the horizon for fully managed hosting providers like NationalNet.

During the Global Futures 2045 International Congress, held in New York City last week, Kurzweil, who has become known for astoundingly accurate predictions about the future of technology made years before they came to fruition, stated emphatically that he believes humanity will be able to upload our minds to computers within the next 32 years, and that within the next 90 years our bodies will be largely replaced or augmented by machines.

“Based on conservative estimates of the amount of computation you need to functionally simulate a human brain, we’ll be able to expand the scope of our intelligence a billion-fold,” said Kurzweil, who went on to cite Moore’s Law, which holds that the power of computing doubles, on average, every two years, while also cross-referencing the impact of recent developments in genetic sequencing and 3D printing throughout his seminar.
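
As a back-of-the-envelope check (our arithmetic, not Kurzweil’s), the billion-fold figure is consistent with that doubling rate:

    import math

    # At one doubling every two years, how long until a 10**9-fold increase?
    doublings = math.log2(1e9)   # ~29.9 doublings
    print(round(2 * doublings))  # ~60 years, the same order as his timelines
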
Theorists may argue about the timeline, claiming that what Kurzweil calls the ‘Singularity’ (a point when humans will effectively achieve intellectual immortality) may take place sooner or at a more distant point on the horizon of human achievement, but from a pragmatic perspective his comments underscore the increasing importance of data security and of hosting that is always accessible to its intended audience.

Today, people expect their images and methods of communication to work flawlessly, without fail or interruption. It’s easy to imagine how much higher the stakes will become when clients are asking us to back up their grandmother’s consciousness, or to store a lifetime of memory files and keep those data restoration backups as current as possible at all times.

The bleeding edge of technological advancement, where science and science fiction rub up against each other so frequently, is vitally important to anyone who would rather prepare properly than frantically scramble at the last minute to enjoy the benefits of emerging tech the moment it is unveiled on the market.

NationalNet already utilizes industry best practices and redundant procedures to secure the data of our clients to the highest level of preservation currently possible. More importantly, we always have an eye toward what will soon become possible, and you can rest assured that your sites (and perhaps one day your consciousness) will be fully protected by our expert staff, utilizing the continually evolving scientific know-how necessary to keep your most valuable assets performing with peak optimization and maximum safety.
