Wednesday, May 11, 2016

Next generation hotspots: The future of Wi-Fi?

In the home, in the enterprise, in public spaces and, increasingly, in “Internet of Things” applications, Wi-Fi has come to dominate how end users connect to the broader network, offering good speed, high throughput and low cost. In 2015, for the first time, Wi-Fi carried more mobile network traffic than cellular networks did: Cisco’s Visual Networking Index for mobile put the figure at 51% of overall mobile traffic being offloaded from cellular to the wired network via Wi-Fi and femtocells.

That trend is only expected to continue: by 2020, Cisco predicts mobile offload will increase to 55% of overall mobile data traffic, which at that point will mean about 38.1 exabytes per month, up from 3.9 exabytes per month in 2015. Wi-Fi is also stepping up its game through expanded features and capabilities. 802.11ac’s second wave of features is expected to see Wi-Fi Alliance certification this year, with the first certified devices designed to support gigabit Wi-Fi speeds – also known as WiGig or 802.11ad – set to appear this year as well. Wi-Fi also has a new standard designed specifically for IoT applications, known as Wi-Fi HaLow. Meanwhile, vendors continue to push for expanded adoption of the Wi-Fi Alliance’s Hotspot 2.0, commercially known as Passpoint, which enables seamless authentication to make Wi-Fi act more like cellular.
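As a sanity check on the figures above, the forecast implies a very steep compound growth rate for offloaded traffic. This is a back-of-the-envelope calculation from the quoted numbers, not a Cisco figure:

```python
# Implied compound annual growth of offloaded mobile traffic, using
# the Cisco VNI figures quoted above: 3.9 EB/month in 2015 rising to
# a forecast 38.1 EB/month in 2020.
start_eb, end_eb, years = 3.9, 38.1, 5
cagr = (end_eb / start_eb) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")  # roughly 58% per year
```

In other words, offloaded traffic would need to grow by more than half again every year for five years to hit the 2020 forecast.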

As Wi-Fi expands its features and reach, though, the technology is also encountering new challenges. Its very popularity has caused major congestion in the unlicensed 2.4 GHz band, and the Wireless Broadband Alliance’s 2015 annual report said some operators fear the unlicensed 5 GHz band could become intolerably congested in as little as two to three years. One of the biggest question marks is coexistence: how well Wi-Fi will fare as cellular converges onto the same unlicensed bands via License Assisted Access (LAA) and LTE-Unlicensed (LTE-U) – a question industry players are trying hard to answer.
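The 2.4 GHz congestion problem has a simple structural cause: channel center frequencies sit only 5 MHz apart while a legacy 802.11 channel occupies about 22 MHz, so adjacent channels overlap and only three of the eleven US channels can operate without mutual interference. A small illustrative sketch:

```python
# Why unlicensed 2.4 GHz congests so easily: channel centers are 5 MHz
# apart, but a legacy (DSSS) 802.11 channel is about 22 MHz wide, so
# most channel pairs interfere with each other.
CHANNEL_WIDTH_MHZ = 22          # legacy DSSS channel width

def center_mhz(ch):
    """Center frequency of US 2.4 GHz channel ch (1-11)."""
    return 2412 + 5 * (ch - 1)

def overlaps(a, b):
    # Two channels interfere if their centers are closer than one width.
    return abs(center_mhz(a) - center_mhz(b)) < CHANNEL_WIDTH_MHZ

non_overlapping = []
for ch in range(1, 12):         # greedily pick mutually clear channels
    if all(not overlaps(ch, c) for c in non_overlapping):
        non_overlapping.append(ch)

print(non_overlapping)  # [1, 6, 11]
```

With only three clear channels to share among every access point, phone and IoT device in range, heavy contention in dense deployments is inevitable.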


The article covers future trends in Wi-Fi, including standards work; the controversy around LTE-U and its potential impact on Wi-Fi; spectrum issues; and the role the technology is likely to play for service providers in the home and at other offload points, including monetization tactics based on analytics, services and IoT.

Extracted from: The Future of Wi-Fi by Kelly Hill

Contact us: info@liteglobal.com

Tuesday, May 10, 2016

Why Do I Need Managed Services?


This is an important question to ask when you run a business full of computers, servers and employees: “What will I get with managed services that I couldn’t get with a break-fix solution?” Information technology (IT) systems are expected to meet high standards of operation while offering 24/7 availability, security and performance.

In today’s environment, you have to keep pace with the constant changes in IT, performance demands, and pressure to deliver competitive IT functionality. To meet these challenges, many organizations consider outsourcing their IT activities to be an attractive option.

Managed services are everywhere: most technology companies now offer something under that label. If you are like most non-techies, the term may draw a blank. But if you are a small business owner, it is definitely something you need.

Organizations are increasingly turning to managed service providers (MSPs) to handle elements of their IT needs as part of a collaborative arrangement with the internal IT department, according to new research from IT industry trade association CompTIA.

Companies have become more familiar with managed services and are turning to them for management of certain IT functions, particularly email hosting, customer relationship management (CRM) applications, storage, backup and recovery, and network monitoring.

While one-time projects account for some of these engagements, a significant portion is ongoing management of one or more IT functions by a managed services provider. CompTIA’s research also found a much higher degree of familiarity with the term “managed services” and greater adoption.


Monday, May 9, 2016

Security and Virtualization in the Data Center


Threats facing today’s IT security administrators have grown from relatively trivial attempts to wreak havoc on networks into sophisticated attacks aimed at profit and the theft of sensitive corporate data. Implementing robust data center security capabilities to safeguard sensitive, mission-critical applications and data is a cornerstone of the effort to secure enterprise networks.

The data center security challenges do not stop there. New application rollouts, virtualization and an increasingly transparent perimeter are converging to drive an evolution in the requirements for data center security architectures.

Application rollouts bring their own set of challenges for securing communications and applying security policy; couple this with a virtualized environment, and the challenge of policy enforcement and visibility increases many times over.

Traditionally, the perimeter has been the proverbial shield stopping malicious and unwanted outside traffic from leaking into the enterprise network. A secure perimeter is still valid and essential for defending against attacks and filtering traffic, but the amount and variety of traffic entering the enterprise network has increased and continues to grow. Extranet connections for business partners, vendor connections, supply chain transactions and digital communications all require more openings at the perimeter to allow communication. Permitting these business-driven openings creates greater opportunities for attack and elevates the risk to the network.

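
The trade-off described above can be sketched in a few lines: each business-driven opening becomes one more allow rule at the perimeter, and anything matching an allow rule passes through unchallenged. The rule format and names here are hypothetical, purely for illustration:

```python
# Minimal sketch of perimeter allowlisting: every business-driven
# opening is one more allow rule, and traffic matching any rule is
# permitted. Source labels and ports are invented for this example.
ALLOW_RULES = [
    {"src": "partner-extranet", "port": 443},   # business partner portal
    {"src": "supplier-vpn",     "port": 8443},  # supply chain transactions
]

def permitted(src, port):
    """Return True if traffic from src on port matches any allow rule."""
    return any(r["src"] == src and r["port"] == port for r in ALLOW_RULES)

print(permitted("partner-extranet", 443))  # True  - the opening works...
print(permitted("partner-extranet", 22))   # False - ...but only that port
```

Every rule added for a legitimate business need is also a path an attacker can probe, which is why traffic arriving through these permitted openings still needs inspection deeper in the network.
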
In addition, attack vectors have moved higher in the stack to subvert network protections and aim directly at applications. HTTP-, XML- and SQL-based attacks are fruitful avenues for attackers because these protocols are usually allowed to flow through the enterprise network and enter the intranet data center.

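
For the SQL-based attacks mentioned above, the standard application-layer defence is to bind user input as query parameters rather than splicing it into SQL strings. A minimal sketch using Python's built-in sqlite3 module (the table and data are hypothetical):

```python
# SQL injection reaches the data center over "allowed" protocols; the
# application-layer fix is parameterized queries. Table and values here
# are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "alice' OR '1'='1"   # classic injection payload

# Vulnerable: the payload is spliced into the SQL string, rewrites the
# WHERE clause, and matches every row.
bad = conn.execute(
    f"SELECT secret FROM users WHERE name = '{user_input}'").fetchall()

# Safe: the driver binds the value as data, never as SQL, so the
# payload is just an oddly named user that matches nothing.
good = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)).fetchall()

print(len(bad), len(good))  # 1 0
```

The vulnerable query leaks the stored secret; the parameterized one returns nothing, because the injection payload is treated as a literal value.
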
Virtualization is driving change in the way data centers are being architected. Server virtualization is becoming a prevalent tool for consolidation, power savings and cost reduction. It is also creating new challenges for infrastructure and security teams, which must provide consistent levels of isolation, monitoring and policy enforcement, similar to what is available for physical servers and systems today.

Device virtualization offers new design opportunities for creating flexible data center architectures. Features providing control plane and data plane isolation open up a multitude of options for device placement, Layer 2 and Layer 3 designs, and service integration.

The data center provides the critical application services for business operations. New architectures that leverage device and server virtualization enhance data center capabilities while increasing the availability of these services. With careful planning and best-practice techniques, security can be integrated with these next-generation architectures without creating a hindrance, yielding scalable security solutions that increase service availability and better protect the critical information residing in the data center.


Contact us at: info@liteglobal.com