Friday, June 6, 2008
Microsoft CEO: "No print media in 10 years". I say it may be sooner than that!!
Microsoft CEO Steve Ballmer has been quoted as saying there will be "no media consumption left in 10 years that is not delivered over an IP network." Ballmer went on to say there "will be no newspapers, no magazines that are delivered in paper form. Everything gets delivered in an electronic form."
To this I say, duh!!!
Somebody sign me up for a CEO gig out west and I'll shout obvious statements from the mountaintop. I don't mean to be over the top here, but for many people, myself included, this all-digital media concept is already a reality.
Every morning, I wake up and turn on the TV (delivered via an IP network), eat breakfast, get dressed and check some news on my PDA. After that, it's off to the office, where a flurry of digital media is accessed via the desktop. I can browse any newspaper online, read RSS feeds from hundreds of different websites, and even watch a streaming feed of CNBC.
The fact of the matter is I already get every piece of news and media digitally; I can't even fold a newspaper very well.
I know there are many out there who don't realize this, but the reality is that technology advances exponentially, and in the past five years the proliferation of the mobile web and streaming media has been enormous. With wireless networks beefing up and PDAs, smartphones, and UMPCs getting faster and more accessible to the everyday user, this all-digital future will become a reality before we know it.
If you would like to learn more about Accent or our products and services, click here to visit our website (www.accentservices.com) or send us an e-mail at acsoffice@accentservices.com.

Tuesday, June 3, 2008
Is LTE the next must-have mobile broadband technology?
Credit: Networkworld.com
Long Term Evolution (LTE)-based services are garnering a lot of attention in the mobile broadband industry, despite the fact that they are at least two years away from being deployed.
LTE, considered by many analysts to be the next big wave in 4G wireless technology, is due to be launched commercially in 2010 by Verizon and AT&T, roughly two years after the Clearwire coalition’s big commercial WiMAX launch slated for later this year.
Technically speaking, LTE is the latest evolution of Global System for Mobile Communications (GSM) technology. Its developers at the 3rd Generation Partnership Project (3GPP) dubbed it “Long Term Evolution” because they view it as the natural progression of High-Speed Packet Access (HSPA), the GSM technology currently used by carriers such as AT&T to deliver 3G mobile broadband.
GSM is by far the dominant mobile standard worldwide, with more than 2 billion customers globally. In the United States, however, the only carriers that currently use GSM are AT&T and T-Mobile. Verizon and Sprint both use the rival Code Division Multiple Access (CDMA) technology, although Verizon is due to move over to the GSM side when it launches its own LTE network sometime in 2010.
While it is far too early to predict how successful LTE will be in the enterprise market, recent trends indicate that demand for the technology could get a significant boost as businesses demand ever-faster mobile broadband access. For instance, a recent survey conducted by market research firm Chadwick Martin Bailey reports that nearly half of all enterprises currently use 3G cellular services, and that more than one-third plan on using WiMAX technology within the next year.
The major reasons for deploying mobile enterprise applications, the survey finds, include increased employee productivity and increased employee availability, as more than 80% of corporate users list both of them as key reasons for using more mobile technologies. If demand for increased mobile broadband speeds continues to be strong, LTE could be in a good position to compete with WiMAX as a widely deployed mobile broadband standard when it comes to market in 2010.
Check out the whole article here.
Mozilla shoots for geek world record
Credit: Networkworld.com
Mozilla is aiming to create what may be the geekiest world record ever with its upcoming Firefox 3 browser release.
The company on Wednesday started a campaign asking users to pledge to download the next full release of its browser on the day it is available so the release can set a Guinness World Record for the largest number of software downloads in 24 hours.
Mozilla has not yet unveiled exactly when Firefox 3 will be available, but expects it could be as soon as mid-June. A test release of Firefox 3 is currently available online.
The company is deeming the day of its release "Download Day" and is asking fans to not only pledge to download Firefox 3, but to host parties to encourage friends to download with them, and place "Download Day" buttons on their Web sites as reminders of the big day.
Currently there is no world record for software downloads; Mozilla is trying to create one with Firefox 3 and its Download Day festivities.
According to the campaign's Web site, once Download Day is over, Mozilla plans to provide the Guinness Book of World Records a signed statement of authentication from its judges showing that it followed rules for breaking records; the company also will confirm download numbers. Mozilla also plans to send video footage and photographs of Mozilla users hosting download parties as well as download logs for a sample size of Firefox 3 downloads to prove it has set a world record.
While the fanfare may seem a bit geeky, Firefox -- released in November 2004 -- has inspired a significant and rather fervent fan base. This is in part because it was the first browser in years to give Microsoft's Internet Explorer viable competition. The browser even has its own fan page (sign-in required) on the Facebook social-networking site, with 79,174 fans signed up and counting.
According to Mozilla, there are more than 175 million users of Firefox, which is available in more than 45 languages and used in more than 230 countries.
More information about how users can participate in Download Day is available on the campaign's Web site.
Tuesday, May 27, 2008
MIT researchers: morphing Web sites could bring riches
This is pretty cool and a little Big Brother-ish as well. Network World originally posted this article a couple of days ago, but it caught my eye and looks to be an interesting concept. Check it out and send us your thoughts.
Credit: NetworkWorld.com
Web sites that automatically customize themselves for each visitor so they come across as more appealing or simply less annoying can boost sales for online businesses by close to 20%, MIT research says.
These sites adapt how they display information so that everyone who visits sees a version best suited to their preferred style of absorbing information, say the four researchers who write about such sites in "Website Morphing", a paper being published this month in Marketing Science.
So the site might play an audio file and present graphics to one visitor, but present the same information as text to the next depending on each person's cognitive style. Morphing sites deduce that style from the decisions visitors make as they click through pages on the site.
"You need five to 10 clicks before you can really get a pretty good idea of who they are," says John Hauser, the lead author of the paper and a professor at MIT's Sloan School of Management. He says over the past decade statistics have evolved to allow broader conclusions from less data.
"You can infer a lot more from a lot less data by borrowing data from other respondents," he says. "When I first heard it I thought this couldn't possibly work."
But it does. By using a sample set of users navigating a test Web site, individual businesses can set the baseline for what click choices on that site say about a visitor. Over time, as real potential customers visit the live site, the morphing engine fine-tunes itself to draw better conclusions about visitors' preferences and to serve up the pages most likely to lead to a sale, Hauser says.
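To make the idea concrete, here is a minimal, illustrative sketch of how a morphing engine might operate: keep a probability for each cognitive-style segment, update those probabilities as clicks come in, then serve the page variant with the best expected conversion rate. This is a simplified stand-in, not the actual model from the MIT paper; the segment names, click likelihoods, and conversion rates are invented assumptions.

```python
# Illustrative sketch only -- not the MIT "Website Morphing" model.
# Segments, click likelihoods, and conversion rates below are invented.

SEGMENTS = ["visual", "verbal"]          # hypothetical cognitive styles
PRIOR = {"visual": 0.5, "verbal": 0.5}   # start with no information

# P(observed click | segment): how likely each segment is to click a link type
CLICK_LIKELIHOOD = {
    ("graphic_link", "visual"): 0.7, ("graphic_link", "verbal"): 0.3,
    ("text_link",    "visual"): 0.3, ("text_link",    "verbal"): 0.7,
}

# Assumed conversion rate of each page variant ("morph") for each segment
CONVERSION = {
    ("graphics_heavy", "visual"): 0.12, ("graphics_heavy", "verbal"): 0.04,
    ("text_heavy",     "visual"): 0.05, ("text_heavy",     "verbal"): 0.10,
}

def update_beliefs(beliefs, click):
    """Bayesian update of segment probabilities after one observed click."""
    posterior = {s: beliefs[s] * CLICK_LIKELIHOOD[(click, s)] for s in SEGMENTS}
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}

def choose_morph(beliefs):
    """Pick the page variant with the highest expected conversion rate."""
    morphs = {m for (m, _) in CONVERSION}
    expected = {m: sum(beliefs[s] * CONVERSION[(m, s)] for s in SEGMENTS)
                for m in morphs}
    return max(expected, key=expected.get)

beliefs = dict(PRIOR)
for click in ["graphic_link", "graphic_link", "text_link"]:  # 5-10 clicks in practice
    beliefs = update_beliefs(beliefs, click)
print(beliefs, choose_morph(beliefs))
```

After a handful of clicks the beliefs tilt toward one segment, which is why Hauser's "five to 10 clicks" threshold is enough to start morphing.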
The software is open source and available at MIT's Web site, but so far no one has created a commercial business to apply it to individual customers, he says.
Such auto-customizing Web sites are less intrusive than the alternative - sites that visitors can manually customize, a time-consuming process that many visitors won't bother with, the researchers say. And they create the right Web site for maximum sales much quicker, Hauser says.
Check out the rest of the article here.
Tuesday, May 13, 2008
Japanese Internet satellite hits 1.2Gbps
Credit: NetworkWorld.com
Engineers testing a recently launched Japanese data communications satellite have succeeded in establishing a two-way Internet link running at 1.2G bps (bits per second) each way, they said Monday.
The speed represents a record for satellite communications, according to the Japan Aerospace Exploration Agency and the National Institute of Information and Communications Technology.
The tests were carried out on May 2 as part of verification of the Kizuna satellite. In the tests, data was transmitted on two 622M-bps channels, both up to the satellite and down to a receiving antenna, for a combined data transmission speed of 1.2G bps.
Kizuna was launched on Feb. 23 and is intended to provide high-speed Internet links to homes and offices in remote areas, to organizations as a back-up during natural disasters and to improve regional communications links in Asia.
One of the satellite's special features is an on-board Asynchronous Transfer Mode (ATM) switch. In other satellite Internet systems, data sent to the satellite has to be relayed to an earth station, demodulated, switched to its destination, and then remodulated and sent again via the satellite to reach its destination. With a switch on board, the satellite is capable of doing all this in space, making more efficient use of the available frequency spectrum, according to JAXA.
Tests carried out in March and April verified uplink communications at 1.5M bps and downlink at 155M bps using a compact 45-centimeter antenna and both up and downstream at 155M bps using a larger 1.2-meter dish. At the time JAXA said the 155M-bps downlink was the fastest in the world achieved with such a small-size antenna.
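As a quick sanity check on the headline number, the two 622M-bps channels mentioned above do sum to roughly 1.2G bps, and a link of that speed moves a CD's worth of data in a few seconds. The payload size in the sketch below is my own illustrative assumption, not a figure from JAXA.

```python
# Quick sanity check on the Kizuna throughput figures (illustrative only).
channel_mbps = 622                       # each of the two test channels, per the article
combined_gbps = 2 * channel_mbps / 1000  # ~1.24 Gbps, the article's "1.2G bps"
print(f"Combined throughput: {combined_gbps:.2f} Gbps")

# Assumed example payload: a 700 MB CD image (my assumption, not a JAXA figure)
payload_bits = 700 * 10**6 * 8
print(f"Time to move ~700 MB: {payload_bits / (combined_gbps * 10**9):.1f} s")  # ~4.5 s
```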
Thursday, May 8, 2008
Can you define cloud computing????
Cloud computing is all the rage these days; you've even seen articles about it on this site. But can you provide the true definition of cloud computing? Can anybody!?!?
Well, hosting company Joyent got together with some of the big names in technology and tried to define cloud computing. Interviewees include Tim O’Reilly, Dan Farber, Rafe Needleman, Brian Solis, and Stowe Boyd.
Check out the video below and let us know what you think of the cloud computing phenomenon.
IT departments must prepare for $200-a-barrel oil and rising demand for teleworkers
Here's an interesting article from TechRepublic blogger Bill Detwiler regarding the price of oil affecting demand for teleworking and the impact on corporate networks. As oil prices rise, businesses will take a harder look at allowing employees to work from home instead of incurring the cost of commuting on a daily basis.
Many organizations provide automobiles and pay for the cost of gas for some of their employees, which can be a large financial burden as oil prices rise. Accent has a large fleet of service vehicles, plus vehicles for certain office personnel, that accumulate a significant fuel bill each month. Personally, my business travels amount to $100 - $125 per week in fuel. Multiply that by 20 and one can understand the reason for the increased demand for teleworking.
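For a rough sense of what that multiplication means, here is the back-of-the-envelope math using only the figures quoted above (the per-week range and the factor of 20 from this post):

```python
# Back-of-the-envelope fuel cost, using only the figures quoted in the post.
low, high = 100, 125      # dollars of fuel per week for one traveler
multiplier = 20           # the "multiply that by 20" from the post

weekly = (low * multiplier, high * multiplier)
annual = (weekly[0] * 52, weekly[1] * 52)
print(f"Weekly: ${weekly[0]:,} - ${weekly[1]:,}")   # $2,000 - $2,500
print(f"Annual: ${annual[0]:,} - ${annual[1]:,}")   # $104,000 - $130,000
```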
Check out the article and link to the site below.
Credit: TechRepublic.com
Admins, start your VPNs! As oil and gas prices soar, IT organizations should prepare to support more remote workers.
On Tuesday, Goldman Sachs analyst Arjun N. Murti predicted that oil prices may hit $150 or even $200 a barrel in the next six months to two years. Murti believes this “super-spike” will be driven by a lack of adequate growth in supply and could lead to demand rationing in developed nations (particularly the United States). Whether Murti’s prediction comes to fruition or not, fuel prices and transportation costs are likely to continue their steep rise for the foreseeable future - barring the unlikely discovery of new, easily accessible oil reserves or the rapid development of alternative energy sources.
As transportation costs rise, organizations and workers will look for ways to reduce travel. For many employees, this will mean working from home to eliminate the daily commute. As I wrote in response to IBM’s prediction that the “virtual workplace will become the rule”, I’m not convinced the traditional office workplace is in immediate peril, but I believe a hybrid model will emerge. Employees will work from home a few days each week.
Today’s lesson: Start preparing now
Many IT organizations, particularly in large enterprises, already support a distributed workforce. IT leaders within this category should ensure their infrastructure has the capacity to support increased demand. IT departments not currently supporting remote users should begin exploring their options now. At the very least, you should make certain your network can support existing remote workplace technologies. Also, IT will not be immune from this trend. IT leaders must develop the skills and techniques required to manage a distributed workforce.
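One concrete piece of that preparation is simply sizing the pipe: estimating whether the existing Internet link and VPN gear can absorb the extra concurrent remote sessions. A rough, illustrative capacity check might look like the sketch below; the head counts, per-user bandwidth, and link size are assumptions for the sake of the example, not figures from the article.

```python
# Rough VPN capacity check (all numbers below are illustrative assumptions).
employees = 500                 # total staff
remote_share = 0.40             # fraction working remotely on a given day
concurrency = 0.60              # fraction of those online at the same time
per_user_kbps = 300             # average VPN traffic per active session
link_mbps = 100                 # capacity of the office Internet link

active_sessions = employees * remote_share * concurrency
demand_mbps = active_sessions * per_user_kbps / 1000
headroom = link_mbps - demand_mbps

print(f"Active VPN sessions: {active_sessions:.0f}")
print(f"Estimated demand:    {demand_mbps:.1f} Mbps of {link_mbps} Mbps")
print(f"Headroom:            {headroom:.1f} Mbps")
```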
Here are resources that can help you support and manage remote workers:
- TechRepublic’s VPN Policy
- TechRepublic’s Remote Access Policy
- Learn the basics of virtual private networks (VPNs)
- 10 tools to help your remote workers stay in touch
- Fixed Mobile Convergence can centralize business numbers and reduce airtime
- Use special project management techniques for dispersed teams
- Unified communications: What it means to your business
- Why unified communications bring out the best in VoIP
- Unified communications terminology cheat sheet
- Presence: What is it, and why do you need it?
- Managing mobile devices the Microsoft way
- 10 things you should do before letting users take their laptops out the door
- Mobile devices are the new network perimeter: Can they be secured?
- It’s 9:00am: Do you know where your people are?
Friday, May 2, 2008
IBM, Google stirring up a cloud environment
Credit: NetworkWorld.com
Google and IBM are testing a cloud computing infrastructure that could become an important avenue for them to deliver software and services to consumer and business users.
Google's Eric Schmidt and IBM's Sam Palmisano addressed a gathering of IBM business partners in Los Angeles on Thursday and revealed the two companies have developed a cloud computing environment that runs on Linux and includes Xen virtualization and an Apache implementation of the Google File System called Hadoop.
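For readers unfamiliar with Hadoop, it is best known for pairing a distributed file system modeled on Google's with the MapReduce programming model, in which a job is expressed as a mapper and a reducer. Below is a minimal word-count sketch in the Hadoop Streaming style, where the mapper and reducer are ordinary scripts reading standard input; it illustrates the programming model generally and is not code from the IBM/Google environment described in the article.

```python
#!/usr/bin/env python
# Minimal word-count in the Hadoop Streaming style (illustrative only).
# mapper: emit "word\t1" for every word; reducer: sum the counts per word,
# relying on the framework to sort map output by key before the reduce step.
import sys

def mapper(lines):
    for line in lines:
        for word in line.split():
            print(f"{word}\t1")

def reducer(lines):
    current, total = None, 0
    for line in lines:
        word, count = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    # In a real Hadoop Streaming job these would be two separate scripts handed
    # to the framework; here a command-line argument selects the role so the
    # sketch stays self-contained.
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if role == "map" else reducer)(sys.stdin)
```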
Last year, the two companies teamed up on a parallel-computing initiative.
The IBM/Google cloud environment is being tested at the Massachusetts Institute of Technology, Stanford University and Carnegie Mellon University.
The pair isn't the first to test the cloud concept. Amazon is offering a cloud environment called EC2, in which companies or developers pay only for the capacity they need to run the applications or services that they in turn offer to users or business partners.
While the two CEOs did not announce any future plans, they said the IBM/Google cloud would eventually be used to support an array of services and applications tailored for consumers and businesses.
Google already has a number of online services for consumers including calendars, photos and Google Docs word processing tools. Google also is positioning Google Docs as a suite of applications for corporate users, including offline capabilities.
IBM recently released IBM Symphony, a set of similar office productivity tools that users can download and run locally and can be tied into other services. It also has a set of social-networking tools that could be offered as a service.
The IBM/Google cloud initiative would compete with Microsoft's software-plus-services strategy and the recently announced Live Mesh, a storage and synchronization framework. Microsoft is attempting to tie the cloud to desktop and devices, and its on-premise business and server applications. In April, the company announced it had begun a beta to test a combination of Office and online services under the codename Albany.
Schmidt was quoted by the Dow Jones news service as saying Google's relationship with IBM is a "key plank" in its strategy, "otherwise we can't reach the customers."
Thursday, May 1, 2008
HP Breakthrough Could Spawn Computers That Don't Forget
In another sign of the apocalypse, it looks like HP has taken another step toward a scene out of The Matrix. The article below is from TechNewsWorld.com, and it details HP's discovery of a potential fourth fundamental circuit element that has the ability to remember information.
For those of you who remember electrical engineering from your college years, you know that there are three classical fundamental circuit elements (the resistor, capacitor, and inductor). HP's "memristor" is supposedly the first circuit element that can remember information.
Check out a sample of the article below and give us your thoughts.
Credit: TechNewsWorld.com
Researchers at HP Labs have proven the existence of the "memristor," a component of electrical circuits that could lead to computer systems with memories that never forget, the company announced Wednesday.
The memristor -- short for "memory resistor" -- was previously only theorized to be the fourth fundamental circuit element in electrical engineering. In the April 30 edition of the journal Nature, however, HP (NYSE: HPQ) researchers presented both a mathematical model and a physical example of one.
Consuming far less power than current systems, computers based on the memristor would not need to be booted up and could associate information in much the same way the human brain does.
'Significant Implications'
"To find something new and yet so fundamental in the mature field of electrical engineering is a big surprise, and one that has significant implications for the future of computer science," said R. Stanley Williams, the lead researcher on the work from HP Labs' Information and Quantum Systems Lab.
"By providing a mathematical model for the physics of a memristor, HP Labs has made it possible for engineers to develop integrated-circuit designs that could dramatically improve the performance and energy efficiency of PCs and data centers," Williams added.
Decades-Old Theory
The existence of the memristor was actually first proposed back in the early 1970s by Leon Chua, a distinguished faculty member in the University of California at Berkeley's electrical engineering and computer sciences department.
Chua argued that the memristor joined the resistor, capacitor and inductor as the fourth fundamental circuit element, and that it had properties that could not be duplicated by any combination of the other three elements.
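For readers who want the math behind that claim, Chua's argument is usually summarized in terms of the four basic circuit variables: voltage v, current i, charge q, and flux linkage φ. Each classical element links one pair of those variables, and the memristor fills in the one remaining pairing. The relations below are a standard textbook-style summary added here for context, not equations from the TechNewsWorld article.

```latex
% The four circuit variables: v (voltage), i (current), q (charge),
% \varphi (flux linkage), with dq = i\,dt and d\varphi = v\,dt.
\begin{align*}
  \text{Resistor:}  \quad dv      &= R\, di \\
  \text{Capacitor:} \quad dq      &= C\, dv \\
  \text{Inductor:}  \quad d\varphi &= L\, di \\
  \text{Memristor:} \quad d\varphi &= M(q)\, dq
\end{align*}
```

When M is constant the memristor behaves like an ordinary resistor, but when M depends on the charge q that has flowed through the device, its resistance carries a memory of its history, which is what makes the non-volatile, memory-like behavior described in the article possible.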
One application for HP's new research could be the development of a new kind of computer memory that would supplement and eventually replace today's commonly used dynamic random access memory (DRAM), HP said.
Computers that use DRAM can't retain information when they lose power, and must go through a slow, energy-consuming boot-up process to retrieve data from a magnetic disk required to run the system. Memristor-based computers, on the other hand, would be able to "remember" their information after losing power and would save both power and time by not requiring the boot-up process.
Cloud Potential
Such capabilities could be particularly significant as cloud computing becomes more prevalent, HP noted, since the memory and storage systems used by current cloud infrastructures require significant power to store, retrieve and protect the information of millions of Web users worldwide.
"I'm terrifically interested -- this is a very exciting piece of news," Susan Eustis, president of WinterGreen Research, told TechNewsWorld. "I think it's very clear the reason cloud computing works so well is because it works using memory."
In Google (Nasdaq: GOOG) searches, for example, "responses come up really fast, and that has to be because they're coming out of memory," Eustis explained. "It's not going out to a database, accessing the storage and then coming back."
Memristor technology "would be of enormous use, but not just for cloud computing," she added. "It will impact all of computing."
Tuesday, April 29, 2008
Home networking forum developed to bring compatibility to devices
Credit: TMCnet.com
Infineon Technologies, Intel Corporation, Panasonic and Texas Instruments will try to create a single, global standard for connecting virtually all in-home devices handling digital content, such as movies, music and pictures, using home wiring.
The forum is a companion to the International Telecommunication Union's ITU-T G.hn working group, which is working to create a single MAC and PHY protocol for transporting multimedia across a home's existing wiring, including coaxial cable, power lines and phone lines. From a chip-level perspective, the advantages of a single global standard are obvious: it provides the scale required to create a big business.
"The forum will promote HomeGrid-certified products and ensure interoperability," says Matthew Theall, president of HomeGrid Forum. HomeGrid Forum has 11 founding members: Infineon, Intel, Panasonic, Texas Instruments, Aware Inc., DS2, Gigle Semiconductor, Pulse~LINK, Ikanos Communications, Inc., Sigma Designs and Westell.
Thursday, April 17, 2008
IPv6 exec says sound the alarm
Credit: NetworkWorld.com
The sky is falling on the number of global IP addresses, and IPv6 is the solution, executives from major technology companies said Wednesday.
The exhaustion of available IP addresses using IPv4 brought out the alarmist side of many industry executives. "It's a crisis -- not a market-oriented event," said Akinori Maemora, chairman of APNIC (Asia Pacific Network Information Centre), speaking at the Global IPv6 Summit in Beijing. "We have just three years until IPv4 addresses are depleted. These changes will come suddenly," he said.
The telecom industry is going through "a period of grief" over the end of IPv4, said Tony Hain, IPv6 technical leader for Cisco. "Most people in the world are still in a state of denial" about upgrading to IPv6. "No one will ask for IPv6 until they run out of IPv4 addresses," he said.
IP addresses allow individual devices, including computers, laptops and mobile handsets, to connect to the Internet. Using the current IPv4 system, which offers a total of about 4.3 billion possible IP addresses, some countries, including China, will begin to run out of addresses they can allocate around 2010, according to estimates by the Internet Assigned Numbers Authority and the Internet Corporation for Assigned Names and Numbers.
Switching to IPv6 expands the pool of possible addresses enormously. It would also allow a far greater number of devices to connect, enabling features such as Internet-based remote control of security cameras, and even turning on home appliances from one's desktop at work.
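To put "enormously" in perspective: IPv4 addresses are 32 bits long while IPv6 addresses are 128 bits, so the jump in address space is astronomical. The quick calculation below is added here for context; the bit lengths are standard facts, not figures from the article.

```python
# Address space of IPv4 (32-bit) vs. IPv6 (128-bit) addresses.
ipv4 = 2 ** 32
ipv6 = 2 ** 128
print(f"IPv4: {ipv4:,} addresses")           # ~4.3 billion
print(f"IPv6: {ipv6:.3e} addresses")          # ~3.4 x 10^38
print(f"IPv6/IPv4 ratio: {ipv6 // ipv4:,}")   # 2^96, ~7.9 x 10^28
```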
You can see the whole article on Network World's website.
Tuesday, April 8, 2008
'The Grid' Could Soon Make the Internet Obsolete
Credit: FoxNews.com
The Internet could soon be made obsolete. The scientists who pioneered it have now built a lightning-fast replacement capable of downloading entire feature films within seconds.
At speeds about 10,000 times faster than a typical broadband connection, “the grid” will be able to send the entire Rolling Stones back catalogue from Britain to Japan in less than two seconds.
The latest spin-off from Cern, the particle physics centre that created the web, the grid could also provide the kind of power needed to transmit holographic images; allow instant online gaming with hundreds of thousands of players; and offer high-definition video telephony for the price of a local call.
David Britton, professor of physics at Glasgow University and a leading figure in the grid project, believes grid technologies could “revolutionise” society. “With this kind of computing power, future generations will have the ability to collaborate and communicate in ways older people like me cannot even imagine,” he said.
The power of the grid will become apparent this summer after what scientists at Cern have termed their “red button” day - the switching-on of the Large Hadron Collider (LHC), the new particle accelerator built to probe the origin of the universe. The grid will be activated at the same time to capture the data it generates.
Cern, based near Geneva, started the grid computing project seven years ago when researchers realised the LHC would generate annual data equivalent to 56m CDs - enough to make a stack 40 miles high. This meant that scientists at Cern - where Sir Tim Berners-Lee invented the web in 1989 - would no longer be able to use his creation for fear of causing a global collapse.
This is because the Internet has evolved by linking together a hotchpotch of cables and routing equipment, much of which was originally designed for telephone calls and therefore lacks the capacity for high-speed data transmission. By contrast, the grid has been built with dedicated fibre optic cables and modern routing centres, meaning there are no outdated components to slow the deluge of data. The 55,000 servers already installed are expected to rise to 200,000 within the next two years.
Professor Tony Doyle, technical director of the grid project, said: “We need so much processing power, there would even be an issue about getting enough electricity to run the computers if they were all at Cern. The only answer was a new network powerful enough to send the data instantly to research centres in other countries.”
That network, in effect a parallel Internet, is now built, using fibre optic cables that run from Cern to 11 centres in the United States, Canada, the Far East, Europe and around the world.
See the whole article here.
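As a postscript, the article's "56m CDs - a stack 40 miles high" figure checks out roughly if you assume a standard CD. The thickness and capacity constants in the sketch below are my assumptions for the sake of the arithmetic, not numbers from the article.

```python
# Sanity-checking the article's "56m CDs - a stack 40 miles high" figure.
# Assumed CD constants (not from the article): ~1.2 mm thick, ~700 MB capacity.
cds = 56_000_000
cd_thickness_mm = 1.2
cd_capacity_mb = 700

stack_km = cds * cd_thickness_mm / 1_000_000      # mm -> km
stack_miles = stack_km / 1.609
data_pb = cds * cd_capacity_mb / 1_000_000_000    # MB -> PB (decimal)

print(f"Stack height: {stack_km:.0f} km (~{stack_miles:.0f} miles)")  # ~67 km, ~42 miles
print(f"Annual data:  ~{data_pb:.0f} PB")                             # ~39 PB
```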