Make no mistake: the corporate network is usually the first area of a business to be overlooked – we only take cognizance of it when it stops working. But the repercussions of downtime represent a very definite, very lasting reminder of its importance, writes Paul Luff, country manager at SMC Networks South Africa.

Just recently I was called in to assist a client who had to upgrade firmware, re-adjust routing tables and conduct maintenance to optimize the performance of a switch.
This could only be done on a Sunday between the hours of 10pm and 2am. Any other time frame would have incurred substantial cost to the company as a result of downtime and loss of productivity.
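The economics behind that narrow maintenance window can be made concrete with a back-of-the-envelope calculation. The function name, staff count, and rates below are hypothetical placeholders, not figures from the client in question:

```python
# Rough downtime-cost sketch. All figures are illustrative assumptions;
# substitute your own headcount, loaded hourly rate, and revenue impact.

def downtime_cost(hours, staff, hourly_rate, revenue_per_hour):
    """Estimate the cost of a network outage during business hours."""
    lost_productivity = hours * staff * hourly_rate   # idle staff time
    lost_revenue = hours * revenue_per_hour           # trade that cannot flow
    return lost_productivity + lost_revenue

# Example: a 4-hour weekday outage for 50 staff at R300/hour,
# with R10,000/hour of revenue depending on the network.
print(downtime_cost(4, 50, 300, 10_000))  # 60,000 + 40,000 = 100000
```

Run the same numbers for a Sunday-night window, when staff and revenue figures drop to near zero, and the case for out-of-hours maintenance makes itself.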
This is simply one example of how important the corporate network has become in business and commerce.
All one really needs to do to severely disrupt any substantial business – any organization or company worth its salt, running five or more PCs on a network – is to turn off its distribution point or disconnect access to the Internet.
The reason is simple – remove a PC and you lose a single point of connectivity. But remove the network and you bring all productivity to a grinding halt.
One of the common pitfalls affecting decision makers is the knee-jerk reaction of investing in technology and effectively throwing money at a network or infrastructure problem.
Many companies misallocate their IT spend and aim it at the wrong places. This is particularly evident in the skills factor in network administration and management. This area is more often than not undercapitalised, as decision makers discount the necessity of good networking advice and support.
Business administrators and IT managers must understand the implications of a lack of communication, whether external or internal, and companies must budget proportionately to ensure uptime.
Let us put this into context: businesses spend a huge amount of money on marketing (including advertising) and research. But without the means to communicate, all of this opportunity amounts to very little.
The significance of the network in the market is highlighted by the increased levels of innovation and rollout of technology in this space.
Noteworthy trends include the proliferation of 802.11n, which adds a new dimension to wireless in the form of stability and affordability. Whilst it will never replace a hard wired network, it will add a lot more value than was previously envisaged with normal wireless.
The fact is that anyone who claims that wireless is now upon us is way behind the times. This technology has been around for at least ten years; it is now simply more accessible to smaller users, and it is better understood and trusted. It is also being implemented readily in the right applications.
As for 10G – this technology is available, it is being sold and it is being implemented at a higher rate than many people realize. However, it is currently still a backbone technology and a high-end means of connecting servers to the network.
It is still expensive and, while costs will eventually come down, growth will be limited until a more cost effective form of delivery of 10G to the desktop is found.
One of the critical issues affecting the adoption of 10G is that existing cabling infrastructure does not cater for this technology from an economic point of view. Fibre can accommodate it, but very few companies or businesses are prepared to invest that amount of money.
This is unlike the case of gigabit, which continues to expand through the requirement for speed and a reduction in price.
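The speed difference driving that expansion is easy to quantify. A minimal sketch of idealised transfer times (assuming full line rate with no protocol overhead, so real-world figures will be somewhat worse; the payload size is illustrative):

```python
# Idealised link-speed comparison: how long a payload takes to move
# at full line rate. Real throughput is lower due to protocol overhead.

def transfer_seconds(gigabytes, link_gbps):
    """Seconds to move a payload (decimal GB) over a link at full rate."""
    bits = gigabytes * 8_000_000_000          # decimal gigabytes to bits
    return bits / (link_gbps * 1_000_000_000)  # divide by link rate in bits/s

# Moving a 100 GB backup set over three common link speeds:
for gbps in (0.1, 1, 10):                      # Fast Ethernet, gigabit, 10G
    print(f"{gbps:>4} Gbps: {transfer_seconds(100, gbps):,.0f} s")
```

At these ideal rates the same 100 GB takes roughly 8,000 s over Fast Ethernet, 800 s over gigabit and 80 s over 10G – which is why gigabit to the desktop earns its keep today, while 10G remains a backbone proposition until per-port costs fall.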
Ultimately, if implemented correctly and optimized appropriately, gigabit is still by far the most effective route to go. Businesses with fully fledged and competent IT support infrastructures are making the best use of gigabit.
In terms of networking infrastructure and a company’s ability to capitalize on this technology, a straightforward rule applies as far as financial investment is concerned: work smarter, not harder.
Too often decision makers chase the acquisition of new technology to manage what are actually ill-managed environments. The motivation is that fresh technology will solve issues such as speed and performance – but it often complicates the situation and ends up being more of a burden.