15 Aug 2014

The False Sense of Security Provided By Complicated Passwords

by Bill

For many years, anyone engaged in digital transactions has been conditioned to believe it is vitally important to choose long, complicated character strings whenever creating a password. Sites and business support teams routinely remind customers to include at least one lowercase letter, one uppercase letter and one number or punctuation character. Many sites will not accept a password unless it is at least 7 or 8 characters long, and some suggest passwords of at least 12 characters. However, even the longest, most convoluted password can still be compromised, which makes real support the most important word in digital security.

As digital security analysts at Hold Security first reported, 1.2 billion online credentials were compromised by a syndicate of Russian hackers. Target also famously admitted to having many thousands of credit card accounts compromised recently. The number of reports about massive account hacking operations continues to skyrocket, often the result of dubious server security practices (such as merchants storing passwords as plain text) or reactive support teams that address issues after they happen rather than working to prevent them.
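
If a merchant must store credentials at all, the difference between plain text and a salted, slow hash is enormous for everyone on that list. As a rough illustration, here is a minimal sketch in Python using only the standard library (hashlib, hmac and secrets); the iteration count and storage format are assumptions for the example, not a prescription for any particular system:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple[str, str]:
    """Return a (salt, hash) pair to store instead of the raw password."""
    salt = secrets.token_hex(16)                      # random per-user salt
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt), 100_000
    )
    return salt, digest.hex()

def verify_password(password: str, salt: str, stored_hash: str) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt), 100_000
    )
    return hmac.compare_digest(digest.hex(), stored_hash)

# Only the salt and the hash are ever written to the database.
salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("guess123", salt, stored))                      # False
```

Even then, hashing only limits the damage after a breach; as the rest of this post argues, it is no substitute for proactive monitoring and support.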

If a crime syndicate scoops up a billion accounts and their underlying credentials directly from compromised servers, it matters very little whether your entry on that list was a 7-character, 14-character or 144-character password. What immediately becomes very important is the security and support provided by your host, the companies you do business with and anyone else in the transaction chain responsible for your accounts.

As Robert McMillan astutely pointed out in a recent article, “Some of our ideas about passwords date back to the 1980s, when the National Institute of Standards and Technology came up with some guidelines for creating secure passwords for local area networks. Back then, they’d mail them out to interested computer security types via U.S. Post.” Now NIST is trying to help the U.S. move beyond the password, says Donna Dodson, NIST’s chief cybersecurity adviser. “Putting the burden of security on the end-user and making it more complex just doesn’t work,” she says. “The security has to be usable for the end-user. Otherwise they’re going to find workarounds.”

At NationalNet we have dedicated decades of time, training and experience, along with an equally significant amount of monetary resources, to providing the most proactive support for all of our clients. While no hosting company can guarantee a breach is impossible, we can fully guarantee that we take your account security as seriously as you do and that we take many precautions to protect your data in all regards. If a hack is ever attempted against your digital property, it won’t be the semicolon in your password string that saves you from heartache; it will be the professional support staff that has earned your trust by caring for your accounts at every step along the way, before, during and after any threat is detected.

03 Dec 2013

Google Increasingly Under Attack From Powerful Entities

by Bill

Google began as the company famous for the informal motto “Don’t Be Evil,” but it now increasingly finds itself cast as the new “evil empire” by detractors, inheriting the unenviable position in tech that Microsoft occupied for so long. That may be the harbinger of many changes to the landscape in tech, medicine, mobile and the many other sectors Google has attempted to dominate.

From its on-again, off-again participation in internet censorship in countries under totalitarian rule, to getting caught red-handed collecting sensitive personal data from local WiFi networks with its Street View photography cars, Google finds itself under attack from all sides of late. As the search behemoth throws its weight around, leveraging its dominance in search, it has created some very powerful enemies and is now beset with lawsuits over anti-competitive business practices, from giving its own products and services preferential treatment in search results to copyright infringement claims over targeted advertising, Google’s core business.

Being one of the largest tech companies means, almost by definition, that there will be lawsuits and controversies related to presumed market dominance. Even so, the “Don’t Be Evil” company has, in the minds of many, demonstrably crossed the line, and its business practices and some of its corporate relationships are drawing increasing scrutiny from regulators, competitors and consumers who are growing uncomfortable with how opaque those practices have become.

The latest regulatory action relevant to the tech giant has the FDA ordering 23andMe, a consumer genetic testing company in which Google holds a significant stake, to shut down its testing service. The company is run by its co-founder, who happens to be the wife of one of Google’s founders. In its warning letter, the FDA states that 23andMe has failed to analytically or clinically validate tests that purportedly indicate whether a subject is susceptible to more than 250 illnesses and diseases (from diabetes to breast cancer), and that the company has not been responsive enough in addressing the agency’s concerns.

Reports state that 23andMe has resisted government regulation for years, arguing that it merely provides consumers with information and is not a medical service, though in the last year it did submit several of its disease-specific tests to the FDA for validation. The company has admitted it was slow to respond to the FDA’s questions about the tests, but in a written statement its spokeswoman, Kendra Casillo, said: “Our relationship with the FDA is extremely important to us and we are committed to fully engaging with them to address their concerns.”

From snooping on the public, to trying to oust major companies from their long-held core businesses, to building YouTube into a massive traffic network with often-pirated content, it would seem to many that Google’s “Don’t Be Evil” slogan is becoming an ironic albatross around the company’s neck, as a decade of success has recast the company as precisely the kind of corporate monolith it once warned people about.

It speaks volumes that just about any story about Google in the news has worldwide implications for nearly every website, hosting company and digital business owner. The world slowly migrated away from complete desktop dominance by Microsoft, and the parallels are hard to miss: as computing moves from the desktop to the “always on” infrastructure Google helped create, Google’s own grip may be loosening in much the same way.

18 Nov 2013

A New Round of Attacks on Net Neutrality Policy

by Administrator

The DC Circuit Court, just one step down from the United States Supreme Court, is hearing arguments in a case some believe may lead to striking down the nation’s Net Neutrality rules, which the Federal Communications Commission (FCC) formally adopted in 2010 to codify a principle that has guided the internet since its earliest days. While the courts may yet uphold the rules in this particular case, many experts contend it is only a matter of time before Net Neutrality is litigated out of existence.

As conceived, Net Neutrality requires ISPs to provide the same level of throughput between all content providers and consumers. The owners of these “data pipes,” however, often have their own content they would prefer to deliver, or are simply looking for an additional revenue stream: they could charge site owners to feed their bits to consumers faster, or hamper competition by throttling unaligned content providers so that their own in-house or allied content loads faster and becomes more attractive to consumers.
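
To make the throttling mechanism concrete, here is a minimal, hypothetical sketch in Python of per-source rate limiting with a token bucket; the provider names and byte rates are invented purely for illustration and do not describe any carrier’s actual configuration:

```python
import time

class TokenBucket:
    """Simple token bucket: capacity in bytes, refilled at rate bytes/second."""
    def __init__(self, rate_bytes_per_sec: float, capacity: float):
        self.rate = rate_bytes_per_sec
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, nbytes: int) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False  # over budget: the packet is delayed or dropped

# Hypothetical policy: an "allied" video service gets ten times the budget of a rival.
limits = {
    "allied-video.example": TokenBucket(rate_bytes_per_sec=10_000_000, capacity=10_000_000),
    "rival-video.example":  TokenBucket(rate_bytes_per_sec=1_000_000,  capacity=1_000_000),
}

def forward_packet(source_host: str, nbytes: int) -> bool:
    """Return True if the packet may be forwarded now, False if it is throttled."""
    bucket = limits.get(source_host)
    return bucket.allow(nbytes) if bucket else True  # unlisted sources pass untouched
```

Identical content pushed through different buckets would feel dramatically slower to the consumer, even though nothing about the content itself changed.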

As a small but growing segment of the population “cuts the cord” and gets its entertainment through internet-based services like Hulu and Netflix, cable and telecom companies, which offer their own cable television-style entertainment packages, would like nothing better than to charge Netflix and other providers for premium speed and connectivity, a point of view that conveniently ignores the fact that consumers already pay quite handsomely for broadband service.

As previously reported here by NationalNet, Netflix accounts for more than 31% of all web traffic and YouTube accounts for nearly 19%. From those two entities alone, roughly half of all web traffic could see worse service or higher prices if the Verizon case is decided in favor of the carriers.

The case presently before the court was brought by Verizon, which objects to the requirement that it practice nondiscrimination among websites and application sources. Based on the judges’ statements during oral arguments, experts watching the case closely expect a ruling in favor of the phone and cable companies on that point, while the “no blocking” provision, which prohibits ISPs from completely blocking access to non-preferred sites, is likely to be upheld. That outcome raises the question: is there really a difference between Verizon ‘blocking’ Netflix and Verizon simply making Netflix load 90% slower than a competing movie service owned by Verizon or one of its allies?

If the nondiscrimination rule is struck down, the implications for consumers and website owners will be profound. No longer will sites live or die on the merits of their content, as gatekeepers like AT&T, Verizon and Comcast will be able to demand that content providers pay up or have their throughput slowed and their traffic turned away in droves, presumably straight into the arms of in-house competition or other providers that have anteed up for a better data connection. Netflix, Hulu and iTunes in particular will have their business models turned upside down, as will any other high-traffic, low-cost content provider. Costs will undoubtedly be passed on to consumers, and for those who run commercial sites a new pay-to-play model becomes particularly problematic, as users have shown very limited patience for slow page loads, with 25% abandoning a page if it takes more than 4 seconds to load.

Left unfettered by Net Neutrality rules, ISPs would be able to throttle any site for any reason, whether to strong-arm a fee, drive users to an in-house alternative or as part of some larger scheme. When questioned about his client shaking down site owners to provide reliable service, Verizon’s own lawyer in the case stated: “I’m authorized to state from my client today that but for these [FCC] rules we would be exploring those types of arrangements.”

If the court throws out the nondiscrimination rule, costs for site owners and consumers may rise significantly while innovation is stifled. Even if it is upheld, Net Neutrality remains on shaky ground: a great deal of money will always be at stake for those fighting to overturn it, and this is a war they clearly do not intend to lose.

14 Nov 2013

Online P2P File Sharing and Piracy Traffic Plummeting?

by Administrator

In 2002, peer-to-peer (P2P) file sharing accounted for more than 60% of all web traffic; five years ago it had fallen to 31%, according to Sandvine. Now, according to the most recent internet usage report, P2P traffic has dropped to as little as 10% of the total.

The shift away from file sharing, which many entertainment and software companies consider synonymous with “piracy,” has been accompanied by a rapid rise in sanctioned video streaming traffic, with Netflix accounting for more than 31% of all web traffic and YouTube for nearly 19%, together roughly half of all downstream web traffic in North America.

It would appear that the convenience of high quality legal video streaming for a modest cost, where available, is preferred by consumers to the hassle of searching out files for free. It would also seem anecdotally that the high-profile prosecutions and campaigns to end digital piracy may have dissuaded some digital downloaders from seeking out “illegal” sources of entertainment.

Modern set-top boxes, DVR convenience and other available technologies are also giving consumers easier on-demand access to the TV shows, movies, music and other digitized forms of entertainment they desire. That increase in quality and convenience seems to be incentive enough for many to avoid the risk of prosecution when what they are looking for is already available legally for just a few dollars.

This new paradigm is well underway in the developed economies of the Asia-Pacific region as well, with streaming entertainment accounting for 50% of downstream traffic and average data consumption at roughly double that of users in North America. In the UK, Netflix already accounts for 20% of downstream fixed network bandwidth, just two years after launching in the market.

ISPs are shifting strategy toward bundling immensely profitable entertainment products with their connectivity services. They are attempting to eliminate Net Neutrality guidelines in a bid to bolster their own offerings or extract payments from content providers, while also moving away from unlimited data plans toward charging consumers on a per-bit basis.

As current traffic patterns cause companies to refocus their efforts, the next battle will be fought over how much each customer can consume, how fast each site can be delivered, and how large entities re-divide the pie, now that only 10% of it is being given away for free.

06 Nov 2013

The Rockstar Consortium Sues Google Over Nortel Patent Infringement

by Administrator

The so-called Rockstar Consortium is an unlikely alliance of Apple, Microsoft, BlackBerry, Ericsson and Sony that bought a portfolio of more than 6,000 patents for $4.5 billion from Nortel, the bankrupt Canadian telecommunications company, during an asset sell-off in 2011, a bidding process that Google also took part in but ultimately failed to win.

Last week the consortium filed eight lawsuits in US federal court accusing Google of infringing its patent rights. In addition to Google, the named defendants are Samsung, LG Electronics, HTC, Huawei, Asustek, Pantech and ZTE Corp., which amounts to essentially everyone with a major commercial interest in the Android smartphone business. Beyond the makers of Android devices, one suit targets Google’s advertising operation, which remains the company’s core business, claiming that Google’s targeted advertising and AdWords system infringes patents issued to Nortel for an “associated search engine.”

What makes this case particularly interesting is the fact that Google was an unsuccessful bidder for the Nortel patent portfolio, bidding as much as $4.4 billion to obtain it at the time, before being outbid by Rockstar. The aggressive way in which Google pursued ownership of the patents at the time will make it very difficult for the tech heavyweight to argue in court now that the patents are of no value or that their own business practices are completely unrelated to them.

Perhaps most importantly, it appears that battle lines are being drawn between entire ecosystems and collectives of companies at the top of the tech world. Building sites and networks that remain agnostic to one particular fiefdom of technology companies or another, without running afoul of their constant litigation, requires vigilance on the part of business owners, IT staff and administrators. As always, NationalNet is here to help.

04 Nov 2013

Engineers From Google, Oracle And Red Hat In ObamaCare Tech Surge

by Administrator

With the widely-publicized troubles of the Affordable Care Act’s Healthcare.gov website dominating the news for the past month, key employees of industry heavyweights including Google, Oracle and Red Hat are part of the cavalry charge to help the beleaguered website.

According to the Obama administration, Michael Dickerson, a site reliability engineer on leave from Google, and Greg Gershman, innovation director for smartphone application maker Mobomo, are key players in the so-called “Tech Surge.” Julie Bataille, spokesperson for the Centers for Medicare and Medicaid Services, says: “They are working through the analytics of what happens on the site to prioritize what needs to be fixed.” The project’s management has since been reorganized, with United Health Group’s Quality Software Services unit now overseeing the entire operation; previously, no lead contractor coordinated the efforts of the many contractors and agencies involved in implementing and deploying the site.

While the Affordable Care Act is something of a political hot potato, Oracle CEO, Larry Ellison, took a non-partisan stance when questioned about his company’s participation in the Tech Surge at a recent shareholders meeting, stating: “We think it’s our responsibility as a technology provider to serve all of our customers, and the government is one of our customers” he said. “We are helping them in every way we can. I will refrain from editorial comments about what has happened there. I think most of us want to see our government operating effectively.”

About 8.6 million people visited healthcare.gov in the site’s first week of operation, exposing a number of software flaws; many visitors experienced waits so long they could not even register. The site has been taken offline multiple times over the past month for software upgrades, and the Obama administration insists that capacity is being added to handle the traffic it is seeing. Getting the site running smoothly, and quickly, is a top priority, not only for political reasons but also for the site’s users, who potentially face fines if they have not chosen a health insurance plan by the March 31st deadline.

The technological debacle goes beyond any political debate and serves as a stark reminder that having the right infrastructure and a talented team of technologists in place is essential to creating any significant internet property. There is simply no substitute for building and launching your website the right way the first time, and NationalNet’s experts are always here to help you.

01 Nov 2013

The End of Mobile Minute Limits

by Administrator

AT&T’s announcement last week that it will offer only unlimited talk plans to new smartphone customers marks the definitive end of an era. The other major US carriers, Verizon, Sprint and T-Mobile, abandoned limited-minute plans for new customers some time ago, and AT&T was the final big player to abandon the pricing paradigm.

It is not largesse from the cellular carriers that has driven unlimited voice calling; it merely reflects the market. Voice calling has been declining precipitously, replaced by mobile data in the form of texting and internet access. The upcoming generation, raised in the era of the smartphone, uses text as its primary means of communication, reserving voice calls for only the closest friends and relatives. Actual minute usage has fallen across the board, and customers have been scaling back their mobile contracts accordingly.

While the move to unlimited voice in effect locks in a fixed, though invisible to the consumer, price point for voice service, new technologies such as voice over LTE (VoLTE) are on the horizon that will essentially turn every mobile phone call into a data stream. That will let the industry recapture the per-bit pricing model it craves and eventually reclaim the valuable over-the-air spectrum currently dedicated to voice traffic.

This latest acknowledgment by carriers, following development of cross-platform operating systems like Windows 8, shows that Fortune 500 companies are finally ready to admit “mobile phones” are essentially computers that just happen to fit in your pocket, rather than telephones with some data capabilities.

Mobile now accounts for more than 25% of all digital site traffic, offers significantly higher conversion rates than traditional desktop traffic and opens up a huge world of emerging markets where affordable smartphones are spreading much more quickly than expensive desktop devices. If your properties are not optimized for mobile already, now is the time to take action and secure future revenue streams. NationalNet offers state-of-the-art hosting from our Atlanta data center and full colocation opportunities to help you advance your mobile presence. We are always just one click or call away; if you have mobile connectivity questions, contact us for answers.
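
As one small, hypothetical illustration of server-side mobile optimization (responsive CSS is the more common approach), the sketch below is a self-contained Python WSGI app that serves a lighter page when the User-Agent contains the common “Mobi” token; the page bodies and port are invented for the example:

```python
# Minimal WSGI sketch: serve a lighter page variant to mobile browsers.
# The "Mobi" token heuristic and the page contents are illustrative only.
from wsgiref.simple_server import make_server

DESKTOP_PAGE = b"<html><body><h1>Full desktop experience</h1></body></html>"
MOBILE_PAGE = b"<html><body><h1>Lightweight mobile page</h1></body></html>"

def app(environ, start_response):
    user_agent = environ.get("HTTP_USER_AGENT", "")
    body = MOBILE_PAGE if "Mobi" in user_agent else DESKTOP_PAGE
    start_response("200 OK", [("Content-Type", "text/html"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    # Quick local test on http://localhost:8000/
    with make_server("", 8000, app) as httpd:
        httpd.serve_forever()
```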

28 Oct 2013

Mozilla Lightbeam Tool for Firefox Illuminates Who is Watching Web Users

by Administrator

With online privacy at the top of the public’s mind in the wake of revelations about the NSA’s surveillance programs, Mozilla, the open-source community behind the popular Firefox browser, has launched Lightbeam, an add-on for Firefox that reveals just who is looking over your shoulder as you browse the internet.

Most web users have long been aware that their digital trail is being tracked and used for targeted advertising. Search for hotels in South Carolina and, “miraculously,” ads for hotels in South Carolina will pop up on sites all across your internet travels for weeks to come. Users who install and activate Lightbeam will be able to view real-time visualizations of the sites they have visited and the third-party entities harvesting their data for commercial purposes.

The add-on allows users to opt in to anonymously sharing their data, which will go toward producing a “big picture” view of web tracking, revealing the activity of these third-party data aggregators. Mozilla’s executive director, Mark Surman, says: “It’s a stake in the ground in terms of letting people know the ways they are being tracked. At Mozilla, we believe everyone should be in control of their user data and privacy and we want people to make informed decisions about their Web experience.” While many people are aware of the cookies installed on their computers when they visit a website, fewer realize that third parties use those cookies to glean a user’s interests and browsing history, building a digital picture of the individual for marketing purposes.

Firefox and other major browsers already offer the option of disabling cookies, and the EU has passed “The Cookie Law,” which requires sites to state explicitly how they will use visitors’ data and with whom they will share it, and to obtain consent before installing cookies. Lightbeam goes further: when a user visits a website, the add-on creates a real-time visualization of all the third parties active on that page. As the user browses to a second site, the add-on highlights the third parties that are also active there and shows which of them have tracked the user’s presence on both sites. The visualization grows with every site visited.
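
To see why that cross-site view matters, here is a small, self-contained Python simulation (all names and site lists are invented) of how a single third-party tracker can correlate visits across otherwise unrelated sites using one cookie ID, which is essentially the relationship Lightbeam visualizes:

```python
import uuid
from collections import defaultdict

class ThirdPartyTracker:
    """Toy model of an ad/analytics tracker embedded on many different sites."""
    def __init__(self):
        self.visits = defaultdict(list)  # cookie_id -> list of sites seen

    def embed(self, browser_cookies: dict, referring_site: str) -> None:
        # Set our cookie on first contact, then log every site that embeds us.
        cookie_id = browser_cookies.setdefault("tracker_id", str(uuid.uuid4()))
        self.visits[cookie_id].append(referring_site)

# One browser (one cookie jar), several unrelated sites embedding the same tracker.
tracker = ThirdPartyTracker()
cookie_jar = {}
for site in ["news.example", "shoes.example", "travel.example"]:
    tracker.embed(cookie_jar, site)

# The tracker now holds a cross-site profile keyed by a single cookie ID.
print(tracker.visits)  # {'<uuid>': ['news.example', 'shoes.example', 'travel.example']}
```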

According to Mozilla, trade bodies that would prefer to continue their work unobserved and behind the curtain have exerted “tremendous pressure” on the organization, but the group feels duty-bound to bring transparency to the Web, particularly in today’s climate of uneasiness about how user data is being used and whether privacy has been compromised. Site owners who have agreements with third parties that track their users are advised to make sure they are comfortable with those relationships becoming public knowledge; we would predict that the aggregation of this data will reveal relationships that cost some sites customers. The tool also brings to light just how far ad networks and commercial user tracking have come in a short amount of time. Navigating best practices that balance customer-focused service with privacy protection is likely to be a major point of emphasis in the months and years to come, from a Department of Justice perspective as well as a search engine optimization one.

25 Oct 2013

NASA Laser Transmission of Data Achieves Best Wireless Transfer Speeds Ever

by Administrator

The recent opening of the movie Gravity has many looking to space with a renewed curiosity about the technological marvels that have been deployed over the years by the National Aeronautics and Space Administration (NASA). Many of the mainstream commercial technologies we now take for granted were originally developed by NASA or the Defense Advanced Research Projects Agency (DARPA) of the Defense Department. Useful tools ranging from LED lights to infrared scanning and GPS may soon have a new sibling from the same point of origin making its way toward commercial applications.

NASA announced it has set a new record for communication in space by beaming information between a ground station on Earth and the LADEE probe currently orbiting the moon, some 236,121 miles away. The LADEE probe carries the Lunar Laser Communication Demonstration (LLCD) device, which recorded the record-breaking transmission at 622 megabits per second (Mbps). By comparison, most high-end consumer broadband currently tops out at around 75 Mbps down and 50 Mbps up. Over such enormous distances, the radio systems NASA previously used to communicate with lunar satellites were roughly five times slower than the new laser transmission technology.
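
As a rough back-of-the-envelope comparison (the 1 GB payload is chosen arbitrarily for illustration), the short sketch below works out how long a transfer would take over the 622 Mbps laser link versus a radio link roughly five times slower:

```python
# Back-of-the-envelope comparison of the lunar laser link vs. a ~5x slower radio link.
FILE_SIZE_GB = 1.0                    # illustrative payload
FILE_SIZE_BITS = FILE_SIZE_GB * 8e9   # 1 GB = 8 billion bits (decimal GB)

LASER_MBPS = 622.0                    # rate demonstrated by the LLCD
RADIO_MBPS = LASER_MBPS / 5           # "approximately five times slower"

laser_seconds = FILE_SIZE_BITS / (LASER_MBPS * 1e6)
radio_seconds = FILE_SIZE_BITS / (RADIO_MBPS * 1e6)

print(f"Laser link: {laser_seconds:.1f} s")  # ~12.9 s
print(f"Radio link: {radio_seconds:.1f} s")  # ~64.3 s
```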

Radio waves remain the preferred method because they require far less ‘aiming’ to be effective, but as a target gets farther away, much more power is needed to carry the signal. That means larger antennas on the probes and enormous receiving dishes on the ground, some as large as 70 meters across. Using a concentrated beam of light instead, a spacecraft could send data at much faster rates, carrying higher-resolution images and perhaps the first 3D video from deep space.

While there is reason to be excited, the LLCD method still has many challenges to overcome, especially in the realm of “aim,” because it requires a laser beam to hit a very specific, moving target that can be hundreds of thousands of miles away, and any deviation from a direct hit results in dropped transmissions. Add in the complications of atmospheric changes, weather and so on, and it becomes easy to see that spacecraft are likely to carry both radio and LLCD-style communication systems for the foreseeable future.

That being said, NASA intends to launch a larger more sophisticated version of the LLCD in 2017, calling it the Laser Communications Relay Demonstration (LCRD), which may provide even greater speeds and reliability. As the world continues to move toward wireless devices, mobile connectivity and a need for even greater speeds of transfer with crucial packets of information, NationalNet will continue to monitor all advancements in the communications field to keep our clients ahead of the curve. We may not be far off from relays of concentrated light based transfer systems that allow wireless devices to operate at untethered fiber-optic speeds.

15 Oct 2013

U.S. Government Losing Its Grip On Internet Infrastructure

by Administrator

Last week the directors of the major Internet organizations, including ICANN, the Internet Engineering Task Force, the Internet Architecture Board, the World Wide Web Consortium, the Internet Society and all five regional Internet address registries, met in Montevideo, Uruguay, and essentially turned their backs on the US government. These organizations, tasked with the development and administration of internet standards and resources, are initiating a sharp break from three decades of US hegemony in key matters of internet governance.

The Montevideo Statement on the Future of Internet Cooperation issued by the conference calls for the “acceleration of the globalization” of the functions carried out by ICANN and IANA. The move is being seen as a stern rejection of the current arrangement in which the day-to-day operations of the Internet’s underlying infrastructure have been supervised by the US Department of Commerce since their inception.

This kind of tectonic shift in the governance of the internet is unprecedented and may be just the beginning. The very next day, the President and CEO of ICANN, Fadi Chehadi, met with Brazilian President Dilma Rousseff, a longtime outspoken critic of the United States’ internet dominance in general and of its electronic spying programs in particular.

According to the official statement issued by the Brazilian government after the meeting, Mr. Chehadi asked “the president [of Brazil] to elevate her leadership to a new level, to ensure that we can all get together around a new model of governance in which all are equal.” The statement goes on to call for an internet governance summit to be held in April 2014 in Rio de Janeiro, which seems to cement a path toward internet governance without the direct oversight of the US government, a change many believe was at least in part precipitated by Edward Snowden’s revelations about the NSA PRISM surveillance program and by examples of US government interests using the internet to spy directly on the email and other communications of sovereign nations and their leaders.

This potential loss of US control, with no replacement oversight authority in place to take on the responsibility, is an uncertain development. Writing for the Internet Governance Project, Milton Mueller, a Syracuse University professor, stated: “The proper substitute for unilateral Commerce Department oversight, we argued, was not multilateral ‘political oversight’ but an international agreement articulating clear rules regarding what ICANN can and cannot do, an agreement that explicitly protects freedom of expression and other individual rights and liberal Internet governance principles.” In the past there was but one master, and while the existing setup certainly had its drawbacks, its replacement is likely to be chaotic as various newly empowered and disparate political interests jockey for their agendas.

Domain names, TLDs, IP protocols, the DNS system and just about every other meaningful aspect of the nuts and bolts technologies the internet is built upon may all soon be governed in a completely different way.

So far there has been no official word from the United States government or the Commerce Department on the issue, which may be at least in part the result of the ongoing government shutdown: with all but essential government employees furloughed, vital matters like this one appear to be falling through the cracks.

NationalNet will continue to monitor these changes closely and will report any proactive steps our clients can take to solidify their own position on the world wide web as governments and quasi-governmental groups continue to bicker over the technology, privacy and protocols that underlie a network our entire world now relies on for essential needs on a daily basis.
