18 Nov 2013

A New Round of Attacks on Net Neutrality Policy

by Administrator

The DC Circuit Court, just one step below the United States Supreme Court, is hearing arguments in a case some believe may lead to striking down the nation’s Net Neutrality rules, which the Federal Communications Commission (FCC) officially adopted in 2010 to codify a principle that has guided the internet since its earliest days. While the court may yet uphold the rules in this particular case, many experts contend that it is only a matter of time before Net Neutrality is litigated out of existence.

As conceived, Net Neutrality requires ISPs to provide the same level of throughput between content providers and consumers. The owners of these “data pipes,” however, often have their own content they would prefer to deliver. They are also eyeing the additional revenue stream that would open up if they could charge site owners to feed their bits to consumers faster, as well as a way to hamper competition by throttling unaligned content providers so that in-house or allied content loads faster and becomes more attractive to consumers.

As a small but growing segment of the population “cuts the cord” and receives its entertainment through internet-based services like Hulu and Netflix, the cable and telecom companies, which have their own cable television-style entertainment packages on offer, would like nothing better than to charge Netflix and other providers for premium speed and connectivity packages. That point of view ignores the fact that consumers are already paying quite handsomely for broadband service as it is.

As previously reported here by NationalNet, Netflix accounts for more than 31% of all web traffic and YouTube for nearly 19%. From those two entities alone, roughly half of all web traffic can expect worse service or higher prices if the Verizon case is decided in favor of the carriers.

The case presently before the court was brought by Verizon, which objects to requirements that it practice nondiscrimination among websites and application sources. Based on the judges’ statements during oral arguments, experts watching the case closely expect a ruling in favor of the phone and cable companies on nondiscrimination, while the “no blocking” provision, which prohibits ISPs from completely blocking access to non-preferred sites, will likely be upheld. That outcome raises the question: is there really a difference between Verizon blocking Netflix and Verizon simply making Netflix load 90% slower than a competing movie service owned by Verizon or one of its allies?

If the nondiscrimination rule is struck down, the implications for consumers and website owners will be profound. No longer will sites live or die on the merits of their content; gatekeepers like AT&T, Verizon, Comcast and other major players will be able to demand that content providers pay up or have their throughput slowed and their traffic turned away in droves, presumably straight into the arms of in-house competition or other providers that have anteed up for a better data connection. Netflix, Hulu and iTunes in particular will have their business models turned upside down, as will any other high-traffic, low-cost content provider. Costs will undoubtedly be passed on to consumers, and for those who run commercial sites, a new pay-to-play model becomes particularly problematic: users have shown very limited patience for slow page loads, with 25% abandoning a page if it takes more than 4 seconds to load.

Unfettered by the Net Neutrality rules, ISPs would be able to throttle any site for any reason, whether to strong-arm a fee, drive users to an in-house alternative or serve some larger scheme. When questioned about his client shaking down site owners for reliable service, Verizon’s own lawyer in the case stated: “I’m authorized to state from my client today that but for these [FCC] rules we would be exploring those types of arrangements.”

If the court throws out the nondiscrimination rule, costs for site owners and consumers may rise significantly while innovation is stifled. Even if it is upheld, Net Neutrality remains on shaky ground: there will always be a great deal of money at stake for those fighting to overturn it, and this is a war they clearly do not intend to lose.

03 Sep 2013

Consumer Broadband Connectivity May Be Stagnating In The United States

by Administrator

The Pew Research Center recently announced that only 70% of Americans have broadband internet access, which is up slightly from 66% in 2012, but couched within that number are some distressing statistics.

First, the definition of “broadband” used by the pollsters is essentially anything better than dial-up, and even the FCC classifies a connection of just 4 Mbps down and 1 Mbps up as “broadband,” which is not a very useful benchmark for the demands of the modern era.

Secondly, the “digital divide” shows no signs of abating: 89% of college graduates and 88% of households with more than $75,000 in yearly income have high-speed internet access at home, while broadband penetration stands at a paltry 37% among those who did not complete high school and just 54% among households with incomes under $30,000 per year. That suggests broadband access is fueling economic disparity rather than helping lower-income individuals join the middle class.

Racially, African-Americans and Latinos are far less likely to have high-speed internet access than their Caucasian counterparts. Pew points out, however, that many blacks and Latinos are mitigating these numbers by using smartphones as their primary connectivity platform: among those without home broadband, approximately 32% have mobile devices, adding another 10% of Americans with internet access and bringing adoption more in line across racial demographics, if one counts smartphones as broadband devices, which many experts consider a dubious assertion on its face.

In a recent editorial for Wired Magazine, Susan Crawford, a professor at Columbia University and the Cardozo School of Law and author of Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age, observes: “You could say DSL, satellite, and mobile wireless are all ‘high-speed broadband,’ but that’s just like putting your local high-school football team in the same market as the New York Giants. It’s all football, but the two don’t — and can’t — compete.” With the ever-increasing bandwidth demands of the 21st century, Crawford argues that universal broadband adoption is necessary for America to remain competitive in the new information-centric global economy.

While there have been some moves to bring ultra-high-speed internet access to Americans, such as Google’s famed fiber-to-the-home network in Kansas City, powerful forces among the incumbent providers are hard at work to make restrictive metered usage the norm, delivered through their existing “skinny pipes.”

Meanwhile, other countries are stepping up and making true broadband fiber connections affordable for their citizens. The government of New Zealand, for example, reduced the risk of up-front investment in fiber networks by financing the construction of the basic network infrastructure. The final connections to homes are built by private businesses, which then buy the network back out of the proceeds of users’ subscription fees. As the government’s investment was returned, it reinvested those funds in further expansion of the country’s fiber infrastructure, creating economic stimulus and a self-perpetuating rollout of world-class broadband access for the entire population, with very fast adoption of fiber-to-the-home services.

The shifting economic forces of the new knowledge-based economy, and the slow adoption of access by large segments of society, may signal a need for Americans to rethink policy on broadband internet access. As always, there is plenty of room for American capitalism in the future of connectivity, but that presupposes a responsibility among business interests to produce adequate and affordable options for consumers, lest we create a competitive disadvantage for ourselves as a nation in the global marketplace.

NationalNet continues to invest heavily in the future and takes significant steps each quarter to maintain the competitive advantage our fully managed hosting and server colocation clients have over other companies. Our own efforts are always subject to the larger environmental question of how many people have access to the internet and how robust that access becomes. Bringing modern advancements to consumers requires a modern infrastructure that can be relied on for the speed, throughput and efficiency that digital content, products and services demand. Hopefully the governmental and corporate entities in a position to roll out these advancements all understand how vital their contributions are to the future of our great nation.

05 Aug 2013

Dropbox – Data Ubiquity And An Agnostic Platform Approach to Hosting

by Administrator

Previous attempts like Apple’s MobileMe and Microsoft’s Briefcase aside, the notion of having files available to users across all platforms is a simple concept that has proven very difficult to execute. In practice, users are often forced to email themselves files, or to carry physical copies around on discs and external drives and manually upload the latest versions, a very inefficient system for keeping mobile, desktop and storage devices in sync.

Dropbox, a startup founded in 2007 that launched its flagship service in 2009, created a simple virtual box that appears on a whole host of devices, agnostic to whichever platform the user chooses: PC, Mac, iPhone, whatever. The company’s motto, coined during a pitch meeting with venture capitalists, is succinct: “It just works.”

In 2009, just a few months after launching the service, Drew Houston and Arash Ferdowsi, Dropbox’s twenty-something founders, were called to Apple for a meeting with the team handling the computing giant’s MobileMe service. The question posed floored them: “How did you get in there?” The Apple team was referring to Dropbox’s seamless integration with OS X’s Finder application. The answer was that they had essentially hacked their way in, reverse-engineering the program and inserting their own code. Ironically, though MobileMe was an Apple product, Apple’s own programmers were not allowed to make changes to Finder’s code. The meeting wasn’t an interrogation; it was an honest inquiry, as MobileMe was quite publicly floundering in the marketplace, and an inability to sync files was its biggest deficiency.

With no access to source code, Dropbox had discovered the assembly language that draws Finder’s icons and squeezed in its own icon, repeating the process for every permutation of the operating system (Tiger, Leopard, Snow Leopard, 32-bit, 64-bit), all the while achieving what MobileMe couldn’t, because the Dropbox team was not hindered by Apple’s own cross-department refusal to cooperate.

Similarly, Dropbox worked out integration with Microsoft Windows, iOS and Android, making it a complete cross-platform solution that “just works.” Dropbox’s competitors have been playing catch-up. Apple’s MobileMe became iCloud, which works well for users who inhabit an Apple-only closed universe. Google has launched a web-based collaboration service accessed through the browser. Amazon has its Cloud Drive, an online storage locker. Microsoft has Windows Live featuring SkyDrive, which brings some automatic syncing and cloud-based Windows programs to those who inhabit a Microsoft-only universe.

However, in an era when each tech titan tries to build walled gardens and closed ecosystems, Dropbox continues to capitalize on its technical lead as the only competitor offering true cross-platform ubiquity. Now Dropbox is opening up and allowing other services to piggyback on its system. Notably, Yahoo Mail is integrating Dropbox functionality into its email service, as are app makers such as Shutterstock, Check, Outbox and Loudr.

Dropbox seeks to become the universal standard, and if that vision comes to fruition, users will finally have seamless integration of all of their data: a game of Angry Birds can begin on an Android phone and continue on an iPad. Switching from one smartphone platform to another would be painless as data becomes truly ubiquitous, always available regardless of where you are and what device you choose to use.

Just as carriers have tried for years to artificially restrict access and treat their bandwidth as a branded resource rather than a generic one, device makers may soon find their software advantages eroded as developers produce apps and services that work across all platforms.

The idea of data ubiquity and an agnostic approach to access is something NationalNet Hosting adopted long ago. Whether our clients seek colocated servers in our data center, custom software or unique management options, our team always takes the approach that serves their business best. Even when a fully managed hosting client requests specific hosting changes, we keep the same agnostic approach to technology in place. In an era when so many great new ideas are being launched by competing platforms, being able to cherry-pick the best of them and apply them properly is clearly the best way to go.

For More Information: http://www.wired.com/wiredenterprise/2013/07/dropbox/

22 Jul 2013

The End of “Unlimited” Cable Bandwidth May Be Fast Approaching

by Administrator

Cardozo Law School professor Susan Crawford is also an adjunct professor at the School of International and Public Affairs at Columbia University, a Fellow at the Roosevelt Institute, a former ICANN board member and a Special Assistant to the President for Science, Technology, and Innovation Policy. In her recent book “Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age”, Crawford took a deep look at the forces shaping the flow of bandwidth from carriers to consumers.

In an insightful op-ed penned for Wired Magazine, Crawford explores the future of consumer bandwidth as more consumers cut the cord and obtain their programming à la carte from services such as Hulu and Netflix. The day is fast approaching when cable television is delivered as IP packets just like any other data, without the billing packages artificially constructed by cable operators. With all of this in mind, Crawford contends that the cable industry is working to eliminate “unlimited” bandwidth plans in favor of selling users bandwidth based on consumption.

In many areas, the local cable company holds a virtual monopoly over bandwidth, particularly when it comes to providing the massive throughput that future content will require once publishers make 4K the new industry-standard resolution.

Pushing for tiered pricing now, while it is still relatively easy to accustom users to the notion of paying per bit, positions the carriers for a future in which rising bandwidth requirements would turn that kind of system into an unprecedented cash cow. Crawford foreshadows the arguments of providers like Liberty Media chairman John Malone, who is likely to justify such pricing changes as a way of ‘limiting congestion,’ when in reality, according to Crawford, it is always about little more than maximizing profits. Most cable companies made their network investments quite a while ago and are now reaping massive rewards. In fact, according to Crawford, Time Warner Cable and Comcast have been bringing in revenue of more than seven times their investment in infrastructure for some time now.

Netflix has become the means by which many have reduced or eliminated their cable TV subscriptions, but a usage-based bandwidth business model might change that. Malone has already said it makes sense “So that, you know, Reed (Hastings, CEO of Netflix) has to bear in his economic model some of the capacity that he’s burning … And essentially the retail marketplace will have to reflect the true costs of these various delivery mechanisms and investments.” Translated into plain English: anyone who wants to transmit data over his network is going to have to pay him for the privilege, and unless the consumer cost of bandwidth is tied in some way to the expense of providing it, the new, more open arrangement in which consumers pick the specific channels they want may quickly be priced much the same way carriers have priced their offerings for decades.

Any move by bandwidth providers to a per-bit business model for consumers would have a profound impact on website owners as well. Proper optimization and hosting efficiency would take on an even bigger role in generating and maintaining traffic, as visitors would undoubtedly become reluctant to spend their time and precious bandwidth on sites that waste it. For that reason, NationalNet continues to seek out every available method of delivering data in the most efficient manner possible, utilizing state-of-the-art servers and hosting protocols carefully designed to provide the best user experience in the fastest and most economical manner for publishers and consumers alike.

Read more: http://www.wired.com/opinion/2013/07/big-cables-plan-for-one-infrastructure-to-rule-us-all

03 Jun 2013

Is your web hosting ready for 5 Gigabit Wireless?

by Administrator

Gigabit-Speed 5G Wireless Is Coming

Researchers at Samsung have successfully developed an ultrafast wireless technology they are officially dubbing “5G” (here we go again with the naming games: remember 4G LTE vs. WiMAX vs. EV-DO Rev C?). The technology has already been tested on the data-congested streets of New York City, demonstrating gigabit speeds using an array of 64 antennas. The problem of building those antennas into a practical device that fits in your pocket has yet to be solved.

With ever-increasing speeds over wireless devices, is your company’s data hosting solution ready? NationalNet’s colocation services are! We offer burstable bandwidth options, or choose your own carrier from hundreds in our SSAE 16 certified N+1 data center.

Contact a NationalNet sales associate to talk about a business colocation solution tailored to your company’s needs. Call us at 1-888-462-8638 or find us online at http://www.nationalnet.com

SOURCE: http://ow.ly/lFiXV

09 Nov 2010

Unlimited Hosting – Is There A Catch?

by Administrator

If you have ever searched for a web hosting company for your site, you have undoubtedly come across hosts promising “unlimited hosting” at some too-good-to-be-true price. You asked yourself, “How is this even possible?” when other hosting companies offer the same package for much higher prices. So, is unlimited hosting real? As far as I know, the only thing that is truly unlimited is outer space. That being the case, how can these hosting companies offer “unlimited hosting”?

First, consider that the typical customer these hosts are targeting will not use many resources. The typical web site uses minimal bandwidth, minimal disk space, and may have only one or two email addresses. Of course, this is what the host is counting on, and the offer is a great way for them to get your business. Should you start using what they consider too much bandwidth or disk space, however, you will find a very nice email in your inbox telling you that you have to upgrade some part of your plan, because “we found that you are utilizing resources to the point that it is affecting other customers” or some similar wording. The key to unlimited hosting is not only knowing, but also understanding, the fine print.

If you read the fine print at many of these hosts (which may not be easy to find), you will see numerous caveats. Below are just a few I found during my research.

1. We reserve the right to change the terms of the package at any time. They usually list some time frame for notice, typically 21 to 30 days.

2. Email accounts have limited storage capacity. Of course, for a small upgrade fee you may increase your email storage.

3. Backups are not included, but for a small fee you can add a backup plan.

4. Your bandwidth is part of a shared network. This means they have allocated a single pool of bandwidth for all the “unlimited” customers, and any other customer can effectively cause your site to run slower by eating into the available bandwidth.

5. Large videos are not allowed. What they consider a large video is anybody’s guess.

6. Support is not included with this plan.

7. You may not install any scripts which may affect the performance of the server.

8. “You can add all the content you wish, but maybe not all at the same time. The vast majority of our customers’ sites grow at rates well within our rules, however, and will not be impacted by this constraint.” What exactly does that mean?

9. You may not use your disk space as an off-site backup source.

10. Database servers have a limit on the number of concurrent connections.

Does this mean you should never use these hosting companies? Of course not, but you should approach with caution and keep expectations modest. If you have a small site that you know won’t use much disk space or many server resources, then an affordable “unlimited hosting” package may be perfect for you. Just remember the phrase “caveat emptor,” Latin for “let the buyer beware.”

07 Sep 2010

Do I Need a CDN?

by Administrator

What exactly is a CDN?

As someone who runs a hosting company, I get this question on a regular basis, as we offer a couple of different CDN products. Many people have heard of a CDN, usually from a friend or an online article, but few know what it does or how it works, so let’s start with a definition and an overview.

CDN is short for Content Delivery Network. CDNs were created to deal with proximity when delivering web content: the further away the surfer is from your web site, the slower it loads for them. If your site is hosted in the US, a surfer in the US will see much faster load times than a surfer in Australia, simply by being closer to the server.

A CDN consists of storage servers, called “caching servers” or “edge servers,” strategically placed around the world. Content is pulled from the “origin server” (the server that hosts your web site) and pushed to the edge servers. When a surfer requests that content, the CDN first determines where the surfer is, then finds the edge server nearest to them, and finally checks whether the requested content is on that edge server. If it is, the CDN delivers it from the edge server, providing faster delivery to the surfer. If the content should exist on the edge server but for some reason does not, the CDN pulls it from the origin server and places it on the edge. By pointing the src and href attributes in the site’s HTML at the CDN’s hostname, the webmaster controls which content is served from the CDN and which is delivered directly from the origin server. Now that we know what a CDN is, let’s determine whether you need one.
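
To make that request flow concrete, here is a minimal sketch in Python of the lookup logic just described. The class, the distance function and the data shapes are all invented for illustration; real CDNs route with GeoDNS or anycast, not a Python loop.

```python
# Minimal sketch of the CDN request flow described above.
# Every name here is hypothetical, invented for illustration.

def distance(a, b):
    # Placeholder: treat locations as (lat, lon) tuples.
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

class CDN:
    def __init__(self, origin, edge_servers):
        self.origin = origin        # dict: path -> content on the origin server
        self.edges = edge_servers   # list of dicts: {"location": (lat, lon), "cache": {}}

    def nearest_edge(self, surfer_location):
        # Real CDNs use GeoDNS or anycast routing; a distance function stands in here.
        return min(self.edges, key=lambda e: distance(e["location"], surfer_location))

    def serve(self, path, surfer_location):
        edge = self.nearest_edge(surfer_location)
        if path not in edge["cache"]:
            # Cache miss: pull from the origin, keep a copy on the edge.
            edge["cache"][path] = self.origin[path]
        return edge["cache"][path]

cdn = CDN(origin={"/movie.mp4": b"..."},
          edge_servers=[{"location": (40.7, -74.0), "cache": {}},    # New York
                        {"location": (-33.9, 151.2), "cache": {}}])  # Sydney
print(cdn.serve("/movie.mp4", (-36.8, 174.8)))  # served from the Sydney edge
```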

In the description above, we learned that a CDN has to go through a decision process to determine what to do (where is the surfer, does the content exist on the edge, and so on). That process adds a small delay to the loading of the content, so a CDN is not ideal for all sites or all content. A CDN was designed for larger files, where a brief initial delay is not critical, and for “popular” content, i.e., content that is accessed often. Edge servers do not have infinite disk space, so every CDN automatically expires or deletes content that has not been accessed in some time. That time span is configurable by the site owner, but it is usually 30 days, so it makes little sense to put up content that is only accessed occasionally.
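
The expiry rule is easy to picture in code. Below is a rough sketch of how an edge cache might evict idle content after the 30-day window mentioned above; the cache layout is made up for the example.

```python
# Rough sketch: evict edge-cache entries not accessed within the TTL.
# The cache layout (path -> {"content", "last_access"}) is invented here.
import time

TTL_SECONDS = 30 * 24 * 3600   # the typical 30-day window

def evict_stale(cache, now=None):
    now = time.time() if now is None else now
    stale = [path for path, entry in cache.items()
             if now - entry["last_access"] > TTL_SECONDS]
    for path in stale:
        del cache[path]   # expired: the next request will re-pull from the origin
    return stale
```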

A typical web site is hosted in a single location at a hosting company and usually consists of pages of text and images that have been, or SHOULD BE, optimized for best performance. If a web site is properly optimized, it should load quickly from anywhere in the world. Text is very small, and most web site images are small as well, so putting this small content on a CDN can actually defeat the purpose, due to the decision process explained above. However, if you have large images, large downloads (zip files, software installers, etc.) or movie files, a CDN is perfect, provided the content is popular. We have found that the best use of a CDN is to deliver the site’s text and smaller images directly from the origin server and put the larger files on the CDN, providing a “best of both worlds” experience for the surfer: large streaming movies play much faster from the CDN, and large zip files download much more quickly. The one exception for smaller files is a JavaScript file, or some other file that never changes and is loaded on every page. We have seen some customers use the CDN for JavaScript files with good success.
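
As a sketch of that “best of both worlds” split, the helper below routes large or known-large file types to a CDN hostname and everything else to the origin. The hostnames and the size threshold are made-up examples you would tune for your own site.

```python
# Sketch: route large, static files to the CDN and small ones to the origin.
# Hostnames and the size threshold are hypothetical examples.
import os

CDN_HOST = "cdn.example.com"
ORIGIN_HOST = "www.example.com"
LARGE_EXTENSIONS = {".zip", ".mp4", ".mov", ".flv"}
SIZE_THRESHOLD = 512 * 1024   # ~512 kB; tune to your own content mix

def asset_url(path, size_bytes):
    ext = os.path.splitext(path)[1].lower()
    if ext in LARGE_EXTENSIONS or size_bytes >= SIZE_THRESHOLD:
        return f"https://{CDN_HOST}{path}"
    # Small text and images skip the CDN's lookup delay entirely.
    return f"https://{ORIGIN_HOST}{path}"

print(asset_url("/downloads/update.zip", 200_000_000))  # -> CDN
print(asset_url("/img/logo.png", 8_000))                # -> origin
```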

What a CDN is NOT good for…

CDN edge servers are essentially very large disk arrays. They are very good at delivering content, but they are not designed for processing, so you usually cannot put PHP files, or any other scripts or programs that require server-side processing, on them. You can put JavaScript on them, because JavaScript is processed on the client side by the browser. CDN providers want their edge servers to “shovel content” to the surfer and nothing more; adding server-side processing would slow down the CDN and create a new level of unneeded complexity.

CDNs are also not cheap. Because of the infrastructure required, as well as the software that runs it, building a CDN takes a significant investment. As a result, CDN bandwidth can cost 2-10 times more than the regular bandwidth provided by your hosting company.

Finally, if most of your surfers are in the same region as your web site (i.e., your site is in the US and most of your surfers are in the US), there is little benefit to a CDN: you’ll be paying extra without seeing meaningfully faster speeds.

So, if your web site delivers larger files, whether streamed or downloaded, and you have surfers all over the world whom you wish to give the best possible experience, it may be time to see if YOU need a CDN.

03 Aug 2010

Metered Bandwidth v. Unmetered Bandwidth

by Administrator

In my previous posting, I discussed the difference between transfer (95th percentile) and throughput (per gig). In this article, we’re going to dig a bit deeper and discuss the advantages of metered (uncapped) versus unmetered (capped) bandwidth.

We start with a question from Jonas, a reader, who asks, “With 95 Percentile, can I have my transfer rate limited to some upper limit so that I have a cap on what my bandwidth cost will be each month? I would worry that if I had 2 or 3 days with 10x my normal traffic I would have a heart attack when I get the bill.”

In a word: YES. You can have your bandwidth CAPPED if you like, and if your hosting company allows it. That said, and I can only speak for NationalNet, we do not recommend that our clients cap their bandwidth, and this is based on sound business practice. Here are some examples:

Example #1: You own a web site that earns you revenue, either by selling your wares or by charging clients a monthly fee to access your content. For whatever reason, your site starts getting an unusually high amount of good traffic; bandwidth goes UP, but so do REVENUES, and thus so do PROFITS! If your plan is capped, however, once you reach your limit, surfers are turned away or greeted with a slow, almost unusable site. No one can make purchases, or paying members cannot access your wonderful content, with the end result being a LOSS of revenue and profit.

Example #2: You have a site offering articles or content that other web sites put on their own sites (possibly paying you for it), and your site gets listed on digg.com or discussed on the major news outlets, getting hammered with traffic as bandwidth goes up. If you are capped, the site bottlenecks and surfers have to WAIT for pages to load until the people in line ahead of them are done. Then your clients start complaining, and possibly canceling their service, because their surfers are complaining. Obviously, no one likes losing paying customers.

Of course, the previous two examples assume the traffic hitting your site is good-quality traffic. There are times when your bandwidth is being stolen via hotlinking (another web site linking directly to images or content on YOUR server), and that is never desirable. At NationalNet, our monitoring system alerts us to abnormally high bandwidth; our system administrators will investigate, stop the thieves, and notify you. Even so, you should make a habit of checking your website stats daily. Not only will this help you understand your traffic and visitors better, you’ll also catch any large bandwidth jumps before they become too costly.

So, as you can see, a capped (unmetered) connection is probably not what you want. The key is checking your bandwidth stats regularly and using a host that will watch them for you and alert you if bandwidth starts to exceed your budgeted amount. Also, make sure you know what bandwidth overages will cost. Many hosting companies advertise a server with some amount of bandwidth for X dollars, but nowhere in the plan details do they list the overage charges, and in many cases those charges are considerably higher than the regular committed rate. Be sure you know what those charges are.
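
To see why the overage rate matters so much, here is a toy bill calculator. The commit, price and overage rate are invented numbers for illustration, not anyone’s actual pricing.

```python
# Toy sketch: what a month costs once you blow past your commit.
# All rates below are invented for illustration.

def monthly_bill(committed_mbps, billed_mbps, committed_price, overage_per_mbps):
    """The commit covers up to committed_mbps; the rest bills at the overage rate."""
    overage_mbps = max(0, billed_mbps - committed_mbps)
    return committed_price + overage_mbps * overage_per_mbps

# A 100 Mbps commit at $1,000/month works out to $10 per Mbps, but a $15/Mbps
# overage rate makes the same traffic 50% more expensive past the commit:
print(monthly_bill(100, 130, 1000, 15))  # 1000 + 30 * 15 = 1450
```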

Now, with all of that said, there are times when an unmetered plan is exactly what you want. If you have a site that you know will never exceed your plan unless something is really wrong, or a site that is not revenue-driven, or you don’t really care if it’s slow at times, then an unmetered plan may be exactly what you need. Unmetered plans also tend to be cheaper, because the hosting company knows exactly how much bandwidth it must purchase and does not have to buy extra to cover overages and spikes.

Traffic and bandwidth are, by their very nature, spiky. On any given day, usage swings up and down in fairly wild extremes; our own bandwidth graphs look like mountains and valleys. Joe Surfer gets out of work and bandwidth goes up, and it keeps going up until about midnight EST, when it starts coming down. Special traffic deals, viral marketing and the like all contribute to this “spikiness” (did I just make up a word?). Any host worth its salt must keep plenty of extra bandwidth overhead to cover this spikiness, so that the actions of one or two webmasters do not affect everyone else.

It’s very expensive for a good host to pay for all that bandwidth overhead, but in the long run it’s well worth it.

One final thing to be aware of with unmetered/capped plans is that they are often on shared bandwidth. This means the host is capped by its own upstream providers, or has purchased a set amount of bandwidth and keeps adding customers to it, hoping they never use the full allocation. This is commonly called “overselling.” A good example is a host with a 1 Gbps connection to its provider that sells 200 plans of 10 Mbps each (the equivalent of 2 Gbps) on that single connection. The risk is that if even half of the customers use their full allocation, every customer suffers from the lack of bandwidth to go around. Overselling is a risk some hosts take, but NationalNet never will. It’s not worth risking our reputation on even one day where the network is slow due to overselling.
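
The arithmetic of that overselling example is worth spelling out:

```python
# The overselling math from the example above: a 1 Gbps uplink
# sold as 200 plans of 10 Mbps each.

uplink_mbps = 1000            # 1 Gbps to the upstream provider
plans, plan_mbps = 200, 10

sold_mbps = plans * plan_mbps                 # 2000 Mbps = 2 Gbps of promises
print(sold_mbps / uplink_mbps)                # 2.0x oversubscribed

# If even half the customers max out their plans at once, demand already
# equals the entire uplink, and everyone else starves:
print((plans // 2) * plan_mbps >= uplink_mbps)  # True
```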

23 Jul 2010

What is 95th Percentile?

by Administrator

What is this 95th Percentile (or, the difference between throughput and transfer)?

Many, if not most, hosting companies sell and bill bandwidth based on a method called the 95th percentile. Many, if not most, customers don’t have a clue what the 95th percentile really is. In this article, I’ll try to shed some light on it.

In order to explain, we must first understand the two types of bandwidth billing methods: THROUGHPUT (per-gig billing) and TRANSFER (95th percentile billing). Let’s look at them individually, starting with throughput.

THROUGHPUT is the total SIZE of all the files sent by the server. It is sold in gigabytes (GB) as an aggregate monthly total. For example, say you have a web page called THISPAGE.HTML that is 25 kB, and on it are three graphic images of 25 kB each, for a total of 100 kB. If 100,000 people download that page over the course of a month, your throughput is 100 kB x 100,000 = 10,000,000 kB, or 10 GB for the month. It does not matter whether all 100,000 people hit the server at the same time or were spread evenly over the month; it is still 10 GB of THROUGHPUT.
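
The worked example reduces to a couple of lines of arithmetic:

```python
# The throughput example above as arithmetic: a 25 kB page plus
# three 25 kB images, downloaded 100,000 times in a month.

page_kb = 25
images_kb = 3 * 25
visits = 100_000

total_kb = (page_kb + images_kb) * visits  # 10,000,000 kB
print(total_kb / 1_000_000)                # 10.0 GB of monthly throughput
```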

Now, on to TRANSFER. But before we begin, let me state that under *NO* circumstances can you mix throughput and transfer. It is physically impossible (like trying to add up gallons and nickels); they are two different things.

TRANSFER is measured in megabits per second (Mbps) and reflects how much information is traveling through the Internet “pipe” at any given moment. I like to compare TRANSFER to water in a series of pipes. Imagine that your home PC has a water hose connected to it instead of an Internet connection. The 1/2″ hose connects to a 2″ pipe at the side of your house, and the house connects to the 12″ water main. In this analogy, the 1/2″ hose is your home Internet connection, the 2″ pipe is your ISP, and the 12″ main is the backbone of the Internet. No matter how hard you try, you can only get 1/2″ worth of water into your PC at any given time, because the “pipe” is only a 1/2″ hose.

Now, if I were going to sell you water BY THE GALLON, that would be throughput (see above). Or I could sell you a PIPE and charge for the amount of water you push through it at any given time; that is TRANSFER. Say I take a measurement right now and you are pushing 1″ of water through the pipe; five minutes later, still 1″; five minutes after that, 1/2″; and five minutes after that, 2″. How big a pipe do you need so your traffic flows without backing up like a funnel? You need a 2″ pipe, but you are not using 2″ all the time, so why should you pay for a 2″ pipe all the time? This is where the 95th percentile comes in.

The 95th percentile (an industry standard) simply means that the hosting company reads your pipe every five minutes and adds that reading to a list kept for 30 days. At the end of the month, the list contains 8,640 readings (12 five-minute intervals per hour, 24 hours a day, for 30 days). The list is then sorted from the biggest number to the smallest, so your largest five-minute reading is on top, the second largest next, and so on. The top 432 entries (the top 5%) are discarded, and the 433rd is your “95th percentile”: the number you pay for. The 95th percentile was designed to chop off wild peaks and bill you only for what you sustain on a regular basis. It is a rolling 30-day number that is constantly changing: once you have 8,640 data points, every time a new point is added and the list is re-sorted, the oldest point drops off.
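
That procedure is short enough to write out. Here is a sketch in Python, with a made-up month of samples showing a one-day spike getting chopped off:

```python
import random

# Sketch of the 95th percentile procedure described above: sort the month's
# 5-minute readings descending, discard the top 432 (5%), bill on the 433rd.

def ninety_fifth_percentile(readings_mbps):
    assert len(readings_mbps) == 8640, "expects 30 days of 5-minute samples"
    ordered = sorted(readings_mbps, reverse=True)
    return ordered[432]   # index 432 = the 433rd entry; 0..431 are discarded

# Made-up month: a steady 50 Mbps with one full day (288 samples) at 400 Mbps.
samples = [50.0] * 8640
for i in random.sample(range(8640), 288):
    samples[i] = 400.0

print(ninety_fifth_percentile(samples))  # 50.0 -- the one-day spike is chopped off
```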

As for which is more advantageous, it depends on your site’s traffic patterns. TRANSFER (95th percentile) is good for almost all sites, with very few exceptions. THROUGHPUT (per gig) is recommended for sites with extremely high spikes or very inconsistent traffic. For example, if you have very high traffic every Monday but very low traffic the rest of the week, being billed on THROUGHPUT may be best for you: all those big Monday readings would create an inflated 95th percentile. Very few sites have this type of traffic pattern, however.

With TRANSFER billing, your host should provide 95th percentile graphs (usually MRTG graphs, the industry standard) so you can see your transfer yourself. Check these graphs every day; they can reveal problems as well as show your traffic patterns. You should see highs and lows each day, and these patterns should follow the sun. If you see a flat line across the top of the graphs, your hosting company doesn’t have enough bandwidth to handle your needs (and this is much more common than one would think). ***IF YOU ARE BEING BILLED ON THE 95TH PERCENTILE, MAKE SURE YOUR HOSTING COMPANY PROVIDES YOU WITH THOSE GRAPHS.*** If they refuse, they obviously have something to hide.
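
One crude way to spot that “flat line across the top” symptom programmatically is to count how many samples sit pinned at the observed peak. The function and thresholds below are invented illustrations, not an industry method.

```python
# Crude sketch: flag a graph that is pinned at a ceiling. Thresholds are
# arbitrary example values, not a standard.

def looks_flatlined(samples_mbps, tolerance=0.02, min_fraction=0.25):
    """Flag if >= min_fraction of samples sit within tolerance of the peak."""
    peak = max(samples_mbps)
    pinned = sum(1 for s in samples_mbps if s >= peak * (1 - tolerance))
    return pinned / len(samples_mbps) >= min_fraction

healthy = [10, 40, 80, 35, 12, 55, 90, 20]        # normal peaks and valleys
capped = [100, 100, 99, 100, 98, 100, 100, 100]   # pinned at ~100 Mbps
print(looks_flatlined(healthy), looks_flatlined(capped))  # False True
```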

Hopefully this helps you understand what the 95th percentile is.
