
An effective BYOD plan must balance control with convenience. Here's what to keep in mind.
Managers often believe a bring-your-own-device (BYOD) strategy is a silver bullet for their organization's mobile communication problems. Thoughts of "I don't need to purchase hardware for employees" or "Workers are more productive with their own devices" can mask the challenges that often accompany BYOD programs. Businesses are increasingly targets for data breaches and other security risks, so organizations cannot afford to overlook security when developing a BYOD strategy.
However, there's a fine balance when implementing BYOD security regulations -- you don't want to be so overzealous about security that employees' work is hindered. Done right, BYOD can reduce technology expenses while increasing end users' productivity and improving office morale. An optimal enterprise mobility strategy provides comprehensive device security without impeding employees' pace of work.
For example, many companies have traditionally forced users to connect with a VPN before accessing company resources. On mobile devices, that process is a real pain. It's also not practical -- since most users switch between work and personal tasks, it actually discourages users from staying connected and productive. Companies can implement in-app VPNs and micro VPNs, which automatically connect specific apps to the corporate network without requiring users to make that connection manually. Companies can also distribute secure browsers that open internal links and connect automatically to intranet sites or web application servers, with no need to launch a VPN by hand.
Without a well-designed and unified device management strategy in place, companies risk exposing their most sensitive data to outside sources -- and even competitors -- while stunting employee innovation. Here are three ways to create a plan that maximizes the benefits of BYOD while mitigating the threats.
1. Be transparent with employees.
Attempting to hide unflattering aspects of a BYOD plan can backfire if employees discover them. Being truthful about employee privacy rights and enterprise mobility management (EMM) components fosters a sense of trust between decision makers and their corporate team. We see this often with companies we work with: They explain that the technology is designed to protect and secure, but that it may collect employees' personal location information and personal apps. Be clear that you're not trying to play Big Brother, and that privacy filters are installed to restrict access to most personally identifiable information (PII).
Building BYOD trust works both ways. CIOs and company leaders should feel confident that their employees are responsibly embracing the freedom of enterprise mobility -- and if at any point the leadership team feels that workers are not handling company data securely, they have the option to implement stricter BYOD controls.
Anticipate and embrace the changes the Internet of things will bring, or it will do more harm than good.
The concept of the Internet of things (IoT) dates back to the early '80s, when the first connected appliance -- a Coke machine at Carnegie Mellon University -- was hooked up to the Internet so its inventory could be checked to determine how many drinks were available. But IoT wouldn't become practical until IPv6's huge increase in IP address space allowed us to assign an IP address to every "thing."
The emerging IoT market we see now is all about a new way of connecting people with products and how products will connect with each other. Before long there will be more "things" on the Internet than people, according to Gartner, which predicts over 26 billion connected devices by 2020. Investors are taking note, pouring $1.1B in financing into 153 deals across the IoT ecosystem in 2013, a rise of 11% year-over-year.
While much opportunity and innovation will result from IoT, there's a dark side that should be addressed early on in the adoption cycle.
The dark side: privacy and security
The increase in the number of smart nodes brought on by IoT, as well as the amount of data those nodes will generate, will only heighten concerns around data privacy, data sovereignty, and data security. Additional challenges will include understanding how devices can effectively and securely transmit and store such huge amounts of data. Lightweight messaging protocols such as MQTT (Message Queuing Telemetry Transport) are emerging to transmit that data securely and efficiently.
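MQTT's appeal for constrained devices is its tiny framing overhead. As a rough illustration -- a sketch only, since real deployments would use an MQTT client library such as Eclipse Paho over TLS, and the topic and payload here are invented -- a minimal MQTT 3.1.1 PUBLISH packet can be assembled by hand:

```python
def encode_remaining_length(n: int) -> bytes:
    # MQTT 3.1.1 variable-length encoding: 7 bits per byte,
    # with the high bit set when more bytes follow.
    out = bytearray()
    while True:
        byte, n = n % 128, n // 128
        if n > 0:
            byte |= 0x80
        out.append(byte)
        if n == 0:
            return bytes(out)

def encode_publish(topic: str, payload: bytes) -> bytes:
    # Fixed header 0x30 = PUBLISH, QoS 0, no retain flag.
    encoded_topic = topic.encode("utf-8")
    body = len(encoded_topic).to_bytes(2, "big") + encoded_topic + payload
    return bytes([0x30]) + encode_remaining_length(len(body)) + body

# A hypothetical sensor reading: 31 bytes on the wire, versus
# hundreds of bytes for an equivalent HTTP request.
packet = encode_publish("sensors/thermostat/temp", b"21.5")
```

The two-byte minimum header is a large part of why MQTT suits battery- and bandwidth-constrained "things."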
[IoT scenarios that appear disposable hold broad business opportunities. Read The Internet Of Small Things Spurs Big Business]
If it's online, it's vulnerable. With IoT, we're entering an age where hackers can not only break into government agencies and corporations and routinely perform identity theft, but also target connected houses and cars. It's one thing when your PC or phone acts up, but what do you do when you can't turn on your lights, open your door, or turn on the heat?
Security for IoT has been a concern since the arrival of RFID technology, so addressing security early in the implementation stage will be key to safe and practical IoT adoption. When the US State Department first distributed US passports with RFID tags, passport data could be read from 30 feet away using equipment available on eBay for $250. That forced changes to secure the RFID tags. But security and data privacy risks associated with IoT will remain. If everything is connected to the Internet, in theory anyone can see what's going on anytime they want. What if your connected car is detected at the golf course on a day you called in sick to work?
While some may argue that smartphones have already taken us there, at least you can turn your phone off. Contextual data, like location tracking, can fundamentally undermine privacy if not managed correctly. To do that requires a combination of policy and technology.
Really, really big data
If you thought you had big data prior to IoT, you ain't seen nothing yet. The enormous number of devices, coupled with the sheer volume, velocity, and structure of IoT data, will create challenges in storing, processing and analyzing the data. For enterprises to get the bountiful insights into customer activity that IoT promises, all the data needs to be stored and analyzed somewhere.
Companies should consider using one of the database as a service (DBaaS) offerings to facilitate data ingestion and management. The quicker enterprises can start analyzing their data the more business value they can derive.
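To make the ingest-then-analyze loop concrete, here is a small sketch using Python's built-in sqlite3 as a local stand-in for a hosted database service; the table, columns, and readings are all invented for illustration:

```python
import sqlite3

# In-memory SQLite database standing in for a DBaaS endpoint;
# schema and data below are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        device_id   TEXT,
        recorded_at TEXT,
        temperature REAL
    )
""")

# Simulated batch of IoT sensor readings arriving for ingestion.
batch = [
    ("thermostat-1", "2014-07-01T10:00:00", 21.5),
    ("thermostat-1", "2014-07-01T10:05:00", 21.7),
    ("thermostat-2", "2014-07-01T10:00:00", 19.8),
]
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", batch)

# Immediate analysis: average temperature per device.
rows = conn.execute("""
    SELECT device_id, AVG(temperature)
    FROM readings
    GROUP BY device_id
    ORDER BY device_id
""").fetchall()
```

The point of a DBaaS offering is that this loop -- land the data, query it right away -- runs without the enterprise standing up and scaling the database itself.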
Technology is great if you know how to use it
Does anyone worry that a world where everything has a sensor connected to the Internet may be a world that's too complex for its own good? If we couldn't figure out how to operate our VCR or wireless router, how can we figure out how to debug error messages when our cars, refrigerators, and sneakers are wired to the Internet? Is it possible we are on the path to create a world where many of the things we have won't work and a majority of the population won't know how to fix them?
United Nations' Global Pulse initiative partners with social data provider DataSift to analyze worldwide opinions on vaccinations, poverty, and other humanitarian topics.
United Nations data scientists working in New York; Jakarta, Indonesia; and Kampala, Uganda, will use data aggregation and filtering tools from DataSift to study attitudes of people from around the world, the organizations announced Tuesday.
Researchers from Global Pulse, a UN innovation lab that uses big data to foster sustainable development and humanitarian efforts, will use the DataSift platform for social data analysis across a variety of projects, including those pertaining to poverty, hunger, vaccinations, and other pressing issues, according to Jason Rose, DataSift's senior VP of marketing.
"People often think of social data analysis as useful primarily for monitoring brand sentiment and doing customer service," Rose said in a phone interview with InformationWeek.
[Are you doing the right things with your big data? See Healthcare Data Analytics Gone Wrong.]
DataSift hopes the UN Global Pulse partnership will change that perception. The company processes 2 billion social interactions per day across a variety of global platforms, including Twitter, Tumblr, and WordPress.
"Our engine is designed to take and filter those down to [several] thousand, or whatever the number is that matters to a particular topic area or organization," Rose said.
Global Pulse will use DataSift in a variety of ways. One project will examine attitudes toward immunization in order to improve the success of vaccination campaigns and to prevent the spread of disease. Another will explore how families cope when faced with sudden spikes in food prices.
The streaming-data tool helps UN data scientists quickly discover and analyze emerging trends.
"This has enabled us to create new techniques and methods that give development practitioners the valuable insights needed to improve the well-being of communities all over the world," said Global Pulse director Robert Kirkpatrick in a statement.
One potential use is early detection of disease outbreaks, as well as identifying issues and attitudes toward immunization, said Rose.
Using online data to gauge real-world problems can be tricky, however. When Google launched its Flu Trends website in 2008, for instance, it claimed the service -- which analyzes aggregated search queries -- could detect disease outbreaks faster than global health agencies. But during the 2012 to 2013 flu season, Google Flu Trends overestimated the incidence of flu in the US -- a result of its algorithms not being tuned to account for "heightened media coverage" of flu outbreaks, Google said.
But "that's not the work DataSift is doing," said Rose. Rather, the DataSift software allows the Global Pulse data scientists to apply their own algorithms.
"They are maintaining, updating, tweaking, and testing their algorithms based on the information we're feeding them," he added. "We also have a historic data store. For many of our data sources, you can go back three or four years and back-test all of your algorithms against known outbreaks and events that have happened in the past, and calibrate your algorithms that way as well."
Rose also noted that Twitter isn't the only source of DataSift's social data stockpile. "WordPress is another key source," he said. "Twenty-two percent of the world's websites are running on WordPress."
The UN effort shows the potential of social data analysis, he added.
"We are scratching the surface of the applicability of social data, and I applaud nonprofits like the Global Pulse initiative that are going out there and applying it in new and innovative ways that are doing good in the world," Rose said.
Microsoft pledges to do better after frustrating customers with last week's Exchange Online and Lync Online outages.
Microsoft has provided more details to explain the outages suffered last week by its Exchange Online and Lync Online hosted services. Some customers were unable to reach Lync for several hours Monday, and some Exchange users went nine hours Tuesday without access to email. Many customers took to Microsoft's online forums and social media accounts to voice displeasure, not only at the service outage, but also at Microsoft's handling of the situation.
In a blog post, VP of Office 365 engineering Rajesh Jha said both outages affected Microsoft's North American data centers but that the issues were unrelated. "Email and real-time communications are critical to your business, and my team and I fully recognize our accountability and responsibility as your partner and service provider," he wrote.
[Microsoft VP predicts the cloud will evolve into just a few big players. Read more from the Structure conference: Cloud Trends To Watch: Structure 2014.]
Jha said the June 23 Lync Online disruption stemmed from external network failures that caused a short loss of client connectivity in Microsoft's data centers. The connectivity problem persisted only a few minutes, but Microsoft claims the ensuing traffic spike caused networking elements to become overloaded, which led to some customers' extended service issues.
The June 24 Exchange Online disruption, meanwhile, was caused by a periodic failure that caused a directory partition to stop reacting to authentication requests. Jha said "a small set of customers" lost email access altogether, and that others -- due to another, previously unknown flaw -- experienced email delays. Jha did not divulge how many customers were directly affected by Exchange Online's root error, nor how many dealt with the larger ripple-out effects.
The Exchange outage was compounded by a problem in Microsoft's Service Health Dashboard publishing process. The dashboard indicated to some customers that their services were fully functional, even as those services refused to load.
Jha said Microsoft has a full understanding of the problems that caused the disruptions, and is "working on further layers of hardening" to protect against future outages. He said customers can expect a Post-Incident Report in their Service Health Dashboards. Jha promised it will contain a detailed analysis of what went wrong, how Microsoft reacted, and how the company plans to avoid similar problems going forward. Though Jha's failure to detail how many customers were affected doesn't suggest a particularly transparent tone, Microsoft has a good record for sharing technical details following a service disruption.
Though Microsoft's cloud products experience few outages, this week's problems demonstrate why service lapses can be a big concern when they occur. Microsoft, Google, and others want companies to use cloud services to handle data and applications that have traditionally been hosted and managed in-house. The big cloud players have made progress over the last year, but all it takes is one outage to make professionals reconsider whether they want essential data and services to be handled by a third party.
During Tuesday's Exchange outage, a number of customers made such concerns abundantly clear. Microsoft didn't acknowledge the problems, which started around 6:00 a.m. EDT, for several hours. Even then, communications were labored; the company relied on user forums and social media to spread the word, which, given the Service Health Dashboard problem, left some customers confused and frustrated. Some criticized the company for euphemistically calling the disruption a mere "delay" in email deliveries.
"If by 'delays' you mean 6+ hours of complete outage," wrote Twitter user JD Wallace in response to a Microsoft tweet that acknowledged some Exchange customers were "experiencing email delays."
Others complained that Microsoft was slow to estimate when service might be restored. Some customers said they waited more than an hour to talk via phone with Microsoft reps, only to be given no new information.
"Microsoft needs to work more with us. IT people are getting crazy without having [anything] to tell our users," a user with the handle JanetsyLeandro wrote in an Office 365 community forum. "We need a real update... [It's] causing a big problem to our business."
Time will tell whether the service outage affects the momentum of Exchange Online, Office 365, and other Microsoft cloud products. Was your business hit by last week's outages, and were you satisfied with Microsoft's response? Let us know in the comments.
Microsoft no longer supports Windows XP, but 25% of PC users still use it today -- twice as many as use Windows 8 and 8.1.
You've got to hand it to people who still use Windows XP -- they're a resolute bunch. Microsoft stopped supporting the enormously popular OS almost three months ago, but according to the newest figures from Web tracking firm Net Applications, more than a quarter of PC users still relied on XP in June.
XP remains resilient despite Microsoft's multi-year upgrade campaign, which included frequent reminders that the OS would become vulnerable to malware, and even a zero-day scare shortly after Microsoft ceased support. The latter pressured the company to issue a security fix in a "one-time exception" to its support policy.
[How are Surface Pro 3 early adopters using Microsoft's new tablet? Read Microsoft Surface Pro 3: Customers Speak.]
Net Applications, which scans a network of 40,000 websites and 160 million unique users each month, found that Windows 7 remained the top OS overall, with 50.55% of PC users. That was up meaningfully from 50.06% in May and 49.27% in April. Retailers and OEMs have deemphasized Windows 7 in their consumer offerings, but Net Applications' new numbers reinforce that among businesses that recently upgraded from XP, most chose Windows 7.

Windows XP remained the second most popular OS by a large margin. It snared 25.31% of users in June -- basically flat compared to May. XP commanded 37.17% of the market in June 2013 and more than 29% in January, which means that millions of XP users have indeed upgraded to newer platforms.
But millions still remain, and XP's rate of attrition is slowing. Some businesses are paying Microsoft for extended XP support, and many third-party security vendors offer XP-oriented products and services as well. Consequently, not all of the XP traffic, which is drawn from users who connect to the public Internet, necessarily represents the same security risk.
Windows 8 and 8.1's flat-lining growth speaks to why Microsoft's older OSes remain so popular: Existing Windows customers simply haven't felt compelled to upgrade. In June, Windows 8.1 accounted for 6.61% of the market, up a bit from 6.35% in May. The original version of Windows 8 held 5.93%, down from 6.29% the month before.
With almost 53% of the combined Win 8/8.1 user base, Windows 8.1 achieved its greatest share to date. That said, it remains somewhat puzzling that more Win 8 users haven't moved to 8.1, given that the update is free and earned better reviews than its maligned predecessor. Moreover, Win 8/8.1 combined for only 12.54% of users, which was not only less than half Windows XP's share, but also down from May's 12.64%.
Apple's top performer was Mac OS X 10.9 (Mavericks), which is the current version. It grabbed 3.95% of users. That accounted for almost three-fifths of the overall Mac user base. Fewer than 14% of Windows customers, in contrast, have moved to Windows 8 or 8.1. Apple CEO Tim Cook made a similar comparison last month, arguing that Mac customers find OS X more appealing than Windows customers find Windows 8 or 8.1. That might be true, but Net Applications' numbers still point to concerns for Apple.
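The share arithmetic in the last few paragraphs is easy to sanity-check. The figures are as quoted from Net Applications; the variable names are my own:

```python
# June share of all PC users, as reported above.
win81, win8, xp = 6.61, 5.93, 25.31
mavericks, mac_total = 3.95, 6.73

combined = win81 + win8                  # Win 8 + 8.1 together: 12.54%
win81_slice = win81 / combined           # 8.1's cut of that base: ~52.7%
mavericks_slice = mavericks / mac_total  # 10.9's cut of the Mac base: ~58.7%
```

The numbers bear out the article's claims: Windows 8.1 holds almost 53% of the combined Win 8/8.1 base, the pair together sit below half of XP's 25.31%, and Mavericks accounts for almost three-fifths of Mac users.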
Macs accounted for only 6.73% of PC users overall -- Apple's worst share so far this year. There's a certain amount of noise implicit in Net Applications' sampling methodology, but Apple's computers appear to have lost steam in recent months. OS X's June share was down precipitously from 7.39% in May, and 7.62% in April. The company recently introduced low-cost iMac and MacBook Air models, but Apple this year has otherwise left its consumer PC lineup relatively unchanged. Some would-be customers might have grown impatient and moved on, and some might be waiting for product refreshes.
Windows accounted for 91.53% of the market in June, up from 90.99% in May. Still, Microsoft is surely concerned by Windows 8 and 8.1's stalled growth. Newer, flashier devices could help, including the Surface Pro 3 as well as the fleet of svelte, powerful Ultrabooks recently introduced at the Computex trade show. But Microsoft is moving toward a cloud-focused future, while most of its Windows customers stick with OSes rooted in the past. Based on the most recent rumors, the company will attempt to reenergize the user base with Windows 9. It will allegedly run differently on different types of devices, and will include a desktop-oriented version in which the tiled Start screen is disabled by default.
To multitask in Windows 8, you had to jump between drastically different UIs. But Windows 8.1 changes that: Get more productive using these tips.
Heading into July, just under half the combined Windows 8 and 8.1 user base is still using the first version of the new operating system. That's puzzling. After all, critics and users both trashed Windows 8, and Windows 8.1 Update has earned significantly better marks.
Some of the resistance to Windows 8.1 can be explained. A number of users have experienced update problems, the most extreme and long-running of which prevents Windows 8.1 from installing through the Windows Store. Microsoft is working on a fix. But the problem isn't widespread enough to explain why half the Win 8/8.1 user base has stuck with the maligned original version.
New device sales contribute to Windows 8.1's share, which means that among users already on Windows 8, a huge number -- perhaps over half -- have ignored the free update. Many of these people use non-touch machines, which only makes their hesitancy more baffling. Whereas Windows 8 is awkward for mouse-and-keyboard users, Windows 8.1 Update works well on both touch and traditional hardware.
[Does Microsoft finally have a winning tablet? Read Microsoft Surface Pro 3: Customers Speak.]
Some industry watchers, including those with ties inside the company, have said the Windows 8 brand is tarnished beyond repair. The operating system's poor reputation explains, or so the commentary goes, why Microsoft is allegedly barreling toward a new Windows version codenamed Threshold. Likely to launch as Windows 9, it reportedly will restore the Start menu to the desktop interface and de-emphasize Live Tiles for non-tablet devices, among other major changes. Have Windows 8 users become so disenchanted they have simply lost faith in Microsoft and are dismissing subsequent updates?
Whatever the reason for hesitancy, if you're still using the original version of Windows 8, especially on anything other than a traditional tablet, consider giving Windows 8.1 a try. No, it's not perfect, but it's miles ahead of the Frankenstein-esque original edition, especially if you're a multitasker.
To multitask in Windows 8, you had to jump between drastically different UIs, but the newest versions offer a much more cohesive and productive experience. Whether you're new to Windows 8.1 Update or an experienced user looking to hone your multitasking skills, we've got you covered. Here are five tips to get you started.
1. One person's tool is another person's distraction.
If you didn't like Windows 8's changes, the OS didn't give you many options. Want to boot directly to the desktop? Too bad. But Windows 8.1 Update is much more flexible. It not only recognizes whether it's running on a tablet or PC and attempts to choose the right settings, but also gives you plenty of options to customize the interface to your preference. With a few minutes' work in PC Settings, you can enable or disable a variety of features, such as boot-to-desktop mode and smart corners. If you want a touch-centric Tile interface, you've got it. If you want Windows 8.1 Update to act like a faster, more secure version of Windows 7 (minus the Start menu), you can more or less do that, too.
There are several ways to get started. From the Start screen, you can click the new PC Settings Live Tile, or activate the Charms menu (swipe from the right of a touchscreen, or mouse to the top-right hot corner) and select Change PC Settings. Once you've reached PC Settings, choose PC and devices, which includes a variety of personalization controls.
2. Use the taskbar to switch between legacy and Modern apps.
If you use both Modern and desktop apps, the taskbar makes a great navigation center. In Windows 8.1 Update, you can pin both types of apps to the taskbar. By launching apps from the taskbar (instead of, say, the Start screen, or a desktop shortcut), you'll save yourself the disruption of jumping between the desktop and the tiled Start screen.
Contraceptive implants are nothing new, but the current generation of progestogen-releasing devices needs to be replaced every three years and has to be removed if you want to try for a baby. That may change soon, however, now that the Gates Foundation is backing a Massachusetts biotech company to build the next generation of implantable devices. MicroCHIPS Inc. is building a wirelessly controlled implant that slowly pumps out drugs and could, theoretically, need replacing only once every 16 years.
MicroCHIPS has been testing the "intelligent drug delivery system" with osteoporosis patients who would otherwise require a daily barrage of injections. The Gates Foundation and MIT's Robert Langer, however, believe the technology could solve the family planning crisis that exists in the world's poorest countries. Reservoirs of levonorgestrel, a contraceptive hormone, would be kept inside the 1.5 cm device and could be activated and deactivated at the whim of the user with some sort of wireless controller. Currently in the experimentation stage, the team hopes to solve the issue of security -- preventing anyone but the user from controlling the system -- before submitting the device for FDA approval at some point in the near future.
Via: CNET, MIT Technology Review
Source: microCHIPS
There was a time when companies preferred to have their hardware maintained by the OEM, but rising costs have brought about an alternative. Today more and more companies are turning to third-party maintenance (TPM), which takes a load off their shoulders both financially and functionally. Some companies, however, have yet to be convinced, so here are the top five reasons to choose TPM.

From every angle, TPM serves to maximize the value of a business, and it does so in several ways. The first is the team that comes with any TPM provider, which usually comprises highly qualified specialists, saving the company from keeping such professionals on its own payroll. These specialists know the OEM equipment well and help cut costs by consolidating a wide range of OEM vendor services. Most OEM contracts cover a limited range of services, so the company usually has to hire additional service providers. A TPM provider, by contrast, can offer a broad range of services and is flexible about contract dates, so that multiple contracts end on the same day, making maintenance easier to manage.
The second reason to consider is the single point of contact a TPM provider offers. You will not have to worry about whom to call when an emergency breaks out, and there will be no blame shifting. The third reason is cost-effectiveness: a TPM provider is there to maintain equipment, unlike an OEM, which is also trying to sell new equipment. Because a TPM provider has no such agenda, it will do its best to keep your equipment running for as long as possible. The fourth reason is convenience: an OEM will service only its own equipment, while TPM engineers are qualified to service products from any manufacturer.
Finally, and perhaps most important, there is the speed with which a TPM provider responds. In an emergency, an OEM has an extensive hierarchy to traverse before service arrives; with TPM, one call is often all it takes to get the company back on its feet.
The bandwidth needs of the enterprise have grown exponentially over the past few years, and that growth isn't showing any signs of stopping. Cloud services, mobile access and increased video needs will place even greater demands on the enterprise's bandwidth. Gartner offers a look at just how much those demands will increase between now and 2017, and offers four ideas about how the enterprise can adapt its networks to meet the growing demands for more and more capacity.
Here are the immediate challenges that the enterprise is going to face, according to the conclusions of a recent Gartner report:
• Greater cloud demands. Users with desktop and mobile devices are going to demand increasing amounts of bandwidth. Uplink capacity, which has traditionally been much lower than downlink capacity, will need to be ramped up to meet this greater demand. How much will depend, to some degree, on the particular needs of the enterprise's applications.
• More devices. In addition to accessing cloud resources, the number and kind of personal machines, devices, and other things connecting to enterprise networks is going to increase, adding to the overall background traffic in the enterprise -- even during idle periods.
• More and more video. Video use -- for both personal and professional reasons -- has been rapidly increasing over the past few years. The increase will continue, but at a more modest rate than in the past. HD video via the WAN will increase bandwidth needs.
There are also some recommendations in the report that enterprises can use to be prepared for the next three years:
• Talk about the growth of network use and bandwidth with your users and your business units. This is especially important for cloud and video applications. Budgets should be linked to policies and usage agreements.
• Implement policies and technical mechanisms that help to optimize, or even limit, the video and backup traffic that travels over the WAN and to the cloud.
• Plan network capacity, topology, and service levels so the enterprise can support its growth affordably. VPNs, for example, can support smaller sites and offload certain types of traffic.
• Reduce stored backup and video traffic through various WAN optimization strategies, and develop those strategies with cloud-hosted content in mind rather than content stored in your data centers.
Increasing bandwidth demands aren’t a new problem, and they’re not going away. Make sure your enterprise is ready to deal with the coming changes over the next few years.
During a period of slow global economic growth, the opportunity presents itself for emerging industries to gain recognition and a top spot in place of failing businesses. In the scenario of our most recent worldwide economic slowdown, IT has come out on top as a powerful driver of growth in nearly every developed country.
Newer technological innovations can reshape and transform economies. Take digitization -- the mass adoption of connected digital services by consumers, enterprises, and governments -- for example. According to the Global Information Technology Report 2013 by the World Economic Forum, digitization sent world economic output soaring by nearly $200 billion and added 6 million jobs in 2011. The report highlights how digitization also bolsters improvements in GDP per capita and can contribute to a lower unemployment rate.
Technology innovations change the way we live and work and introduce new ways of doing business. They can also create new problems, such as international cybercrime and privacy issues, that society and policy makers need to wrestle with. As a whole, we should try to balance the risks and rewards of the new technologies that emerge around us. Even with the unintended consequences that come along with new technologies, it is certain that exporting and sharing them worldwide is a force for good. The absolute benefits of a more connected, smaller world far outweigh the inherent risks.
As we look forward over the next ten years, there are a number of technologies that could have a massive, economically disruptive impact. I recently read the McKinsey Global Institute’s 2013 Disruptive Technologies report and was intrigued by the list of the top 12 technologies that they believe will “transform life, business and the global economy.” According to the report, these technologies share four key characteristics: high rate of technology change, broad potential scope of impact, large economic value and substantial potential for disruptive economics. They include:
• Mobile communications and internet
• Automation of knowledge work
• The Internet of Things
• Cloud technology
• Advanced robotics
• Autonomous vehicles
• Energy storage
• Advanced materials
Another common theme tying these technologies together is human advancement. These highlighted areas have the ability to bolster the global economy and make our day-to-day lives easier. But they also have the potential to introduce new challenges and risks that we as a society need to solve.
Take the progression of robotics, for instance. Further automation of certain workforces will create streamlined workflows that can produce a consistent product at a standardized rate without human labor. The benefits are clear, and adoption of this technology is generally good for the global economy. Nonetheless, the human cost, such as lost jobs, can't be underestimated. We'll have to come together as a society – consumers, business leaders and policy makers – to solve these challenges and ensure that we're benefiting from the adoption of these new technologies.
While the overall international impact of these innovations might be clear, what does this mean for your company?
Businesses should focus their efforts on the technologies that can have the biggest impact on their business. Competitive advantage often goes to early adopters who leverage technology to transform their business. As newer, more mind-boggling technologies come to market, IT should move quickly to evaluate and implement those that make sense. And they need to experiment with those technologies that they may not need right away.
As the world gets smaller and more connected, companies need to ensure they have a global R&D strategy. For years, many organizations focused on offshoring R&D to India. Today, tech hotspots are popping up around the globe in places like the U.K.'s Tech City, Germany, Singapore and Brazil. The key for businesses is to tap into these pools of innovation as widely as possible and set up shop in countries where talented engineers and scientists can collaborate directly with customers and prospects.
Clearly, technology innovations hold tremendous promise for our global economy and for our individual businesses. A decade ago, we had no clue the impact that mobile technology would have on our lives and today it’s ubiquitous. Regardless of the upheaval brought about by new technology advances and the new challenges they can introduce, disruptive innovations are the key to a prosperous future. I for one am looking forward to what the future holds!
1. Be transparent with employees.
Attempting to hide unflattering aspects of a BYOD plan can backfire if employees discover them. Being truthful about employee privacy rights and enterprise mobility management (EMM) components fosters a sense of trust between decision makers and their corporate team. We see this often with companies we work with: They explain that the technology is designed to protect and secure, but that it may collect personal data such as employees' location and installed personal apps. Be clear that you're not trying to play Big Brother, and that there are privacy filters installed to restrict access to most personally identifiable information (PII).
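The "privacy filter" idea can be illustrated with a short sketch. Everything here -- the field names and the shape of the device report -- is hypothetical rather than drawn from any particular EMM product; it only shows the pattern of stripping PII before device data reaches administrators.

```python
# Hypothetical BYOD privacy filter: fields treated as personally
# identifiable information (PII) are removed from a device report
# before administrators ever see it. Field names are illustrative.

PII_FIELDS = {"location", "personal_apps", "call_history", "photos"}

def apply_privacy_filter(device_report: dict) -> dict:
    """Return a copy of the report with PII fields stripped out."""
    return {k: v for k, v in device_report.items() if k not in PII_FIELDS}

report = {
    "device_id": "A1B2C3",
    "os_version": "17.4",
    "encryption_enabled": True,                   # security posture: kept
    "location": (40.7128, -74.0060),              # PII: dropped
    "personal_apps": ["chess", "banking"],        # PII: dropped
}

filtered = apply_privacy_filter(report)
print(sorted(filtered))  # -> ['device_id', 'encryption_enabled', 'os_version']
```

In practice, real EMM suites expose this kind of filtering as admin-side policy configuration rather than code, but the effect is the same: security-relevant attributes stay visible while personal data is screened out.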
Building BYOD trust works both ways. CIOs and company leaders should feel confident that their employees are responsibly embracing the freedom of enterprise mobility -- and if at any point the leadership team feels that workers are not handling company data securely, they have the option to implement stricter BYOD controls.









