Tuesday, September 18, 2007

Home Stretch – So what have we learned

Summary and Lessons Learned in CS 855 – Futuring and Innovation

These past eleven weeks have been fast and furious. From our July residency, with Dr. William Halal lecturing on futuring and the principles behind his think tank "TechCast", to our exploration of Web 2.0 and the tools that are now widely available, it has been a wild ride.

Web 2.0
One of the slickest Web 2.0 tools, Animoto, was posted by Ted in our class. It allows you to upload pictures and set them to music to create a video trailer. I took some pictures from our last residency and created a short video.

http://animoto.com/play/3ca65bf4857a660a1464abf52f4e5309


Michelle was the first in our class to demonstrate Gabcast, which makes it easy to record audio blogs using nothing more than a common telephone. I have to admit it is sometimes easier to listen to a blog, and Gabcast has a nice feature that lets you download the file so you can listen to it while you are in the car.

We explored the many faces of Google Web 2.0, from Google Maps, Google Mail, Google Groups, Google Docs, and iGoogle on down the list. As you can see, Google really dominates Web 2.0. With its recent purchase of YouTube it is well positioned to lead the industry into Web 3.0 (the next Microsoft). The real concern with most of Google's Web 2.0 tools is the storage of user data and how well users' privacy will be protected. Pictures that I upload to share with my family are not something I want cached in Google's search engine. This should be a concern for most of the Web 2.0 services we reviewed in this class, not just Google.

Second Life
The idea of a virtual multi-dimensional environment has endless possibilities. Since launching in 2003, Second Life has grown to almost 10 million residents. It now supports real-time classrooms along with a complex economic structure. The ability to hold virtual classrooms and interact not only with a professor but also with other students, while separated by any distance, is the jewel of Web 2.0. The concept of Second Life has expanded the idea of collaboration with endless possibilities: the ability to use Second Life in a work environment to conduct virtual brainstorming sessions, with breakout groups that can still interact as a whole, and the ability to bring together virtual teams that are physically located on different continents.

Predictions
After reading "The Fortune Sellers" it became very clear that predicting the future is based more on luck than on science. The world around us is in constant chaos, making most social systems unpredictable. The stock market is a good example of such a system: very few market analysts are able to predict the direction of the market with any degree of consistency [1]. Areas that are founded on hard science can achieve a higher degree of accuracy if engineering principles are applied.
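To get a feel for why such systems resist forecasting, here is a toy illustration of my own (not a model from Sherden's book): a price series driven purely by coin flips still produces convincing-looking runs and trends, yet a "momentum" forecast based on yesterday's move does no better than chance.

import random

# Toy illustration: a random walk driven by coin flips. Knowing yesterday's
# move tells you nothing about today's, so a momentum forecast hovers near 50%.
random.seed(42)
steps = [random.choice((-1, 1)) for _ in range(10_000)]

# How often does the previous step "predict" the next one?
hits = sum(1 for prev, cur in zip(steps, steps[1:]) if prev == cur)
print(f"Momentum forecast accuracy: {hits / (len(steps) - 1):.1%}")  # roughly 50%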

Failed Predictions
Failed predictions can often be attributed to changing conditions, a lack of supporting data, too great a lag time between the prediction and the event, or the inexperience of the person making the prediction. Some social systems are so complex that any form of prediction is difficult and carries a high level of uncertainty.

My failed prediction was Ford Motor Company's flying car of the 1950s. The prediction failed because of government regulations imposed by the FAA: the air traffic infrastructure would not have been able to handle the traffic if flying cars had gone into mass production. Matt commented that the prediction may not have failed but merely been delayed, and that flying cars may still become a reality. I believe that the problems that prevented flying cars from becoming a reality in the 1950s are still present today, and they are likely to remain an inhibiting factor until air traffic management can be fully automated and the human element removed.

Successful Predictions
One of the most widely known predictions is Moore's Law, which states that circuit density on integrated circuits will double every 18 months. This prediction was made back in 1965 and has remained true for the past 40+ years. If we look at the circumstances around this prediction, we discover that it was made quickly and without much deliberation. Gordon Moore had been asked by Electronics magazine to predict semiconductor progress over the next 10 years. The state of the industry at the time was 30 circuits per chip, and he knew the prototypes he had in the lab were at 60 circuits. Taking an educated guess, he expected the density in 10 years to be 60,000 circuits, a thousand-fold increase, which turned out to be very precise [2]. It is amazing that one of the most famous predictions in computing could have been worked out on the back of a cocktail napkin.
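As a back-of-the-napkin check of my own (not Moore's original calculation), the sketch below shows how a fixed doubling period compounds. A thousand-fold increase over ten years corresponds to doubling roughly every year, which is how the prediction was first stated; the 18-month figure is the rate most often quoted today.

# Sketch of compounding under a fixed doubling period (my illustration,
# not Moore's original math).
def projected_density(start, years, doubling_period_years):
    """Project circuit density after `years`, doubling every `doubling_period_years` years."""
    return start * 2 ** (years / doubling_period_years)

print(round(projected_density(60, 10, 1.0)))  # ~61,000: doubling every year, the thousand-fold extrapolation
print(round(projected_density(60, 10, 1.5)))  # ~6,100: the popularly quoted 18-month doubling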

Summary
This class on Socio-Technical Futuring has been one of the most inspiring courses in the doctoral program. The art of prediction was truly fascinating. It revealed what most of us already suspected: in many cases the expert is no better at making predictions than we are. Some social environments are simply too complex to predict with the tools that are available [1]. What we should recognize is that we can only predict with any degree of certainty what we have under our control. The more uncertainty, the greater the margin of error we are likely to encounter.

References:
[1] William A. Sherden, "The Fortune Sellers: The Big Business of Buying and Selling Predictions," John Wiley & Sons, 1998.
[2] InfoWorld CTO Forum, San Francisco, 2002, http://www.intel.com/pressroom/archive/speeches/gelsinger20020409.htm

Mark's Top 5 Predictions

1. Prediction – In the near future all electronic communications will be computer to computer. The only difference will be whether it is a person or a computer that wants to communicate.

2. Prediction – In the next 10 years the military will adopt computers that are integrated into their clothing, and within the next 15 years we will see this technology become popular in the civilian sector.

3. Prediction – The Internet will extend itself through personal peer-to-peer networks to allow connectivity to any person, anywhere, at any time. All of the communication networks we know today (telephone, TV, and cell phones) will become IP based.

4. Prediction – Computing will become ubiquitous through personal computers that we wear. We will use this interface to connect to network-based computers for additional processing power or information retrieval. This will become possible through the use of encryption and advanced authentication methods.

5. Prediction – Within the next 75 years we will see advances in how we generate and transport electricity based partially on research done by Nikola Tesla and now being investigated by MIT.

Sunday, September 16, 2007

Analyzing Successful Predictions

We are constantly reading predictions, from weather forecasts, to the stock market, to the cost of heating oil this winter. In every major field there is someone making a prediction and trying to convince us based on past historical trends, the alignment of the planets, or the caterpillars having hairy coats this fall. Statistical analysis of these forecasts has shown that the best have about a 50 percent chance of being correct, while others rank with winning the lottery.

Most mutual fund managers don't perform any better than, or even as well as, the S&P index [1]. In fact, most correct predictions are correct only by luck, or the original prediction was incorrectly stated in the first place [2].

The World Meteorological Organization reports the global weather forecasting budget to be approximately $4B per year. Yet long-range weather forecasts are at best 50% correct. What has been shown is that as the lag time between the prediction and the event decreases, the accuracy of the prediction increases [2].

So what makes one prediction better than the next? In what sectors do predictions tend to be reliable forecasts of the future? These are age-old questions that have been pondered for decades. Any prediction should be viewed with caution and judged on the merits it is founded on. If a prediction is based on the reading of sacrificed goat entrails, it should be viewed with much caution. If it is based on hard scientific research, then perhaps more thought should be given to it. In "The Fortune Sellers" the author lists five questions that should be asked: [2]

1. Is the forecast based on hard science?
2. How sound are the methods?
3. Does the forecaster have credible credentials?
4. Does the forecaster have a proven track record?
5. To what extent is my belief in a particular forecast influenced by my personal beliefs and wishful thinking?

Even after reviewing these questions it is still important to ask yourself whether current events have been factored into the prediction. Forecasters may be predicting a mild winter, so the expected price of heating oil may be forecast to be low; however, if an oil refinery is closed due to a fire, the end result may be high heating oil prices. The longer the lag time on a prediction, the higher the probability that an external event will impact it.

These questions should apply to any prediction, whether it appears in a local newspaper or is part of a strategic report from Gartner that your company paid for. Be leery of predictions based on "group think". Just because a group of analysts supports a given prediction doesn't mean that it is correct; they may all have read the same market research report. Look for supporting data from different research sources.

In an attempt to test this approach I have selected a couple of predictions that have been proven with time, to see whether the model is supported. The two predictions are Moore's Law and Elaine Garzarelli's prediction of the 1987 stock market crash.

If we look at the circumstances around Gordon Moore's famous prediction of circuit density doubling every 18 months, we find an interesting story. He had been approached by Electronics magazine to predict semiconductor progress over the next 10 years. The state of the industry at the time was 30 circuits per chip, and he knew the prototypes he had in the lab were at 60 circuits. Taking an educated guess, he expected the density in 10 years to be 60,000 circuits, a thousand-fold increase, which turned out to be very precise [3]. There appears to be a good amount of luck in this forecast, but it was extrapolated from the best data available at the time. If we apply the five questions discussed above, steps 1 and 2 fail, step 3 is met, step 4 would probably fail (insufficient information), and step 5 would probably not be a weighting factor. So to summarize: 1 pass, 3 fails, and 1 unknown; yet the prediction was true and still holds true after 42 years. In the case of Gordon Moore's prediction, he was an expert in his field, he had credible data to extrapolate from, and he had a good share of luck.

For the next case I draw from a case study cited in "The Fortune Sellers". Elaine Garzarelli had developed a model to predict whether the direction of the market would be bullish or bearish. After tracking the stock market for 17 years, her model was showing 92% bearish, the most negative reading it had ever given. She went on Cable News Network's "Moneyline" predicting an "imminent collapse of the market". Four days later the Dow crashed, and she was immediately declared a "market guru". However, while she was able to predict the market crash, she was not able to predict its recovery. In the nine years following her successful prediction she correctly called the direction of the market only 5 times out of 13. If we apply the questions, the model had been developed and tweaked for 17 years and was based on accepted scientific methods, but not necessarily proven theorems. So we would have to say that step 1 failed, steps 2 and 3 passed, step 4 is unknown, and step 5 would depend on your feelings about the health of the market. So again the five-step evaluation model sends an unclear message as to whether to accept or reject the prediction.

To summarize, the five-step model failed to accept the two predictions even though the predictions were true. The first prediction was not only correct but has stood the test of time for the past 42 years. The second prediction was correct but could not be sustained. If these are typical successful predictions, then the model should be amended to the following conditions (a rough scoring sketch follows the note below)*:

1. The forecaster must have credible credentials
2. The forecaster must have current data
3. The lag between the prediction and the time of the event must be short
4. The forecast must be based on proven theorems
5. The methods must be sound
6. The forecaster must have a proven track record
*Note: A larger sampling would be required to prove or disprove the model with any level of confidence.
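As a rough illustration of how this amended checklist might be applied, here is a small scoring sketch of my own; the pass/fail tally and the example scores for Moore's Law are illustrative assumptions, not part of Sherden's analysis.

AMENDED_CRITERIA = [
    "credible credentials",
    "current data",
    "short lag between prediction and event",
    "based on proven theorems",
    "sound methods",
    "proven track record",
]

def evaluate(prediction, results):
    """Tally pass/fail/unknown answers for each criterion and give a crude verdict."""
    passes = sum(1 for c in AMENDED_CRITERIA if results.get(c) is True)
    fails = sum(1 for c in AMENDED_CRITERIA if results.get(c) is False)
    unknown = len(AMENDED_CRITERIA) - passes - fails
    verdict = "lean accept" if passes > fails else "lean reject"
    return f"{prediction}: {passes} pass, {fails} fail, {unknown} unknown -> {verdict}"

# Illustrative scoring only; the True/False values here are my assumptions.
print(evaluate("Moore's Law (1965)", {
    "credible credentials": True,
    "current data": True,
    "based on proven theorems": False,
    "sound methods": False,
}))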


Reference:

[1] Richard A. Ippolito, "Efficiency with Costly Information: A Study of Mutual Funds in the Period 1945-1964," Journal of Finance 23(2) (1968): 389.

[2] William A. Sherden, "The Fortune Sellers: The Big Business of Buying and Selling Predictions," John Wiley & Sons, 1998.

[3] InfoWorld CTO Forum, San Francisco, 2002, http://www.intel.com/pressroom/archive/speeches/gelsinger20020409.htm

Sunday, September 9, 2007

Web 2.0 Google Docs and Spreadsheets

Attached is the transcript of my recent gabcast on Google Docs and Spreadsheets.

http://www.gabcast.com/index.php?a=episodes&query=&b=play&id=13007&cast=39913&castPage=&autoplay=true


Enjoy,
Mark


Hello, my name is Mark Gerschefske and I'm a doctoral student at Colorado Technical University. I'm presently studying Socio-Technical Futuring and reviewing the innovations that have been made with Web 2.0.

I would like to welcome everyone to my gabcast on Google Docs and Spreadsheets. This discussion will provide a brief overview of the collaborative environment for documents and spreadsheets as provided by Google.

This gabcast will be structured into three parts… Part one will discuss Google Docs… Part two will cover Google Spreadsheets… and part three will be a general summation of both Google tools.

GOOGLE DOCS

To start with I'll discuss Docs… I imported some sample text, then started to test the features in the toolbar. All the features, such as fonts and colored text, worked well. Different styles are supported, along with comments. One nice feature was a revision tab, which shows what changes have been made and by whom. The only negative comment I would make is that the editing features are limited to the more common ones; advanced features such as search and replace are not supported. However, it does support saving the text in Word, OpenOffice, and standard formats such as PDF, RTF, and HTML.

It allows you to insert objects such as images, links, comments, tables, bookmarks, and special characters. When you select special characters, it brings up a table showing special characters, Asian characters, Wingdings, and an advanced tab. The advanced tab allows you to specify the Unicode value and preview it before inserting it into the document. Very nice feature…

It also includes an HTML editor, which makes it a nicely rounded package for the right price… free.

GOOGLE SPREADSHEETS

To change from Google Docs to Google Spreadsheets you must click on Docs Home in the upper right-hand corner of the screen. This takes you to your Google Docs home page, which shows all your folders. From there you click on NEW in the toolbar and select Spreadsheet from the drop-down menu.

I started by copying a spreadsheet and pasting it into Google. The first thing I noticed was that all the highlighting disappeared, along with the charts. Next I changed some of the numerical fields to see if the formulas still worked. Nothing!! A little disappointing that the first two tasks failed… So I selected the Formulas tab to see if they were copied over… nothing there. That explained why it could not recalculate the spreadsheet. The next test was to see if I could chart the data that was copied over. That seemed to work fairly well, except that it didn't support logarithmic scaling, so if there is a wide delta between the data points some of them will appear off the scale. This could be considered an advanced function, so I can understand why it wasn't supported.

Other features, such as adding another sheet or duplicating a worksheet, performed well. I decided to compose a simple spreadsheet to see how easy it was to use. I entered the year 2000, clicked on the lower right corner while pressing CTRL, and dragged across the screen, expecting the value to increment by one. Instead I got 2003, 2002, 2004, 2001, and 2005, rather than 2000 incremented by one and ending in 2005. Maybe a new feature, hmmm. When I tried the same task starting from 1 and dragging across ten cells… I got 1, 7, 6, 8, 9, 10, 11, 12, 13, and 14. Once it got to 8 it worked as expected, other than skipping 2 through 5. Well done, Google.


To summarize… Google Docs and Spreadsheets are not bad as entry-level collaboration tools. Both of them suffer from a lack of features we have grown to expect when compared to Microsoft Office. I understand this may be an unfair comparison, but it is the benchmark everyone is compared to. Considering that these are free, Web-based tools, they are not a bad first start. I did appear to find a bug in Google's spreadsheet in the way it increments numbers; I would have expected them to catch such a simple error in testing prior to release. I guess Microsoft is not the only company that has problems with quality control. Another distraction was the time lag due to the tools being Web based. I have a 1.5 meg down Internet connection, but sometimes it took several seconds for changes to appear on the screen. As more features are added, I see Google Docs and Spreadsheets becoming popular with students and home users who don't want to invest in expensive software.

This is Mark Gerschefske signing off and hoping that you found my gabcast on Google Docs and Spreadsheets informative. Thank you.

Monday, September 3, 2007

Web 2.0 – Second Generation Web

I have posted several discussions on what Web 2.0 is. Today I came across an article in the July/August 2007 IEEE IT Pro titled "Understanding Web 2.0" which I would like to share. In my previous postings I have discussed some of the directions of Web 2.0 and how services have evolved from what is called Web 1.0; as an example, screen-scraping technologies have evolved into Web services, and workers are driving the trend. Web 2.0 is the result of social trends that have enabled visionary entrepreneurs to launch sites such as YouTube, MySpace, and Facebook, to mention only a few. What I have failed to mention is what companies are doing to ensure that they are not left behind in this fast-moving Web 2.0 wave.

In a recent survey conducted by the McKinsey Quarterly, 80% of the executives contacted said that their company was investing in Web services. They are using Web 2.0 to bring technology into their business and provide better communications with their customers and business partners. To improve efficiencies with external entities, companies are using peer-to-peer networking, blogs, and RSS feeds. Just as Web 2.0 is improving external communications, over half the companies surveyed stated that Web 2.0 tools were being used to improve the way internal knowledge is distributed and managed. Web 2.0 services are becoming strategic to how companies plan to grow their business. What is interesting is that a lot of these Web 2.0 initiatives are started by small groups or individuals, as previously stated. Due to their ease of use, these tools can be quickly prototyped to support customers and are ad hoc by nature. This allows companies to be very responsive to customers and their feedback.

The three technologies companies were most interested in are RSS feeds, blogs, and wikis, although these were not the most frequently deployed Web 2.0 tools. This shows that companies are excited about collaborative communication technologies and see this direction as having the biggest impact on how they will do business in the future.

Mashups are aggregations of content from multiple sources on a single Web page or Web site. They are usually built on APIs that allow interconnection with a wide array of programs. One example is the integration of online maps with business searches to show the user where a business is located. This kind of aggregation adds value and improves the user's Web experience.
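To make the idea concrete, here is a minimal mashup sketch of my own. The business-search endpoint and map-embed URL below are hypothetical placeholders, not real services; the point is simply how content from two sources gets combined on one page.

import json
import urllib.request

SEARCH_API = "https://api.example.com/businesses?q={query}&city={city}"  # hypothetical endpoint
MAP_EMBED = "https://maps.example.com/embed?lat={lat}&lon={lon}&zoom=15"  # hypothetical map service

def build_mashup(query, city):
    """Fetch business listings and return an HTML fragment pairing each with an embedded map."""
    with urllib.request.urlopen(SEARCH_API.format(query=query, city=city)) as resp:
        listings = json.load(resp)  # expected: [{"name": ..., "lat": ..., "lon": ...}, ...]
    rows = [
        f'<li>{b["name"]} - <iframe src="{MAP_EMBED.format(lat=b["lat"], lon=b["lon"])}"></iframe></li>'
        for b in listings
    ]
    return "<ul>\n" + "\n".join(rows) + "\n</ul>"

print(build_mashup("coffee", "Denver"))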

The proliferation of Web 2.0 is causing companies to rethink how they need to do business to remain competitive. The use of collaborative technologies promises to improve efficiencies within companies and in how they interact with their business partners. With the expansion of global markets, the ability to collaborate across virtual boundaries will be the key to success.

References:
San Murugesan, "Understanding Web 2.0," IEEE IT Pro, July/August 2007.

The McKinsey Quarterly, How businesses are using Web 2.0: A McKinsey Global Survey, March 2007 http://www.mckinseyquarterly.com/article_page.aspx?ar=1913&l2=13&l3=11&srid=9&gp=1

Mark

Wednesday, August 29, 2007

Web 2.0 Tools – Google Docs

This week I decided to try Google Docs for a collaboration project to write a computer security policy for our church. There are three of us on the committee, and it seemed like a good idea to work on the documents together.

The first step was to upload the files to my Google Docs folder. This seemed like a simple enough task. It turns out that there were 11 files for this project, and each file had to be uploaded separately. Not believing that Google would make you do this, I kept trying to select multiple files, but the Google Docs interface would not allow it. So after uploading 12 files (I lost track and ended up uploading one file twice), I was able to copy them into a separate folder. Again, only one file at a time. At this point I was starting to get a little impatient, but I continued. The next task was to share these files. Logically, I thought I could just share the entire folder. Again I was wrong: each file must be shared individually. A pattern was beginning to form. This required that I email an invitation to each member of the group for each file. I think you are starting to get the picture: 11 files meant 11 email invites. After about the fourth invite I decided I had better see how well the collaboration tools actually worked before I spammed everyone's mailbox.

So the next task was to actually edit one of the documents. Since these were boilerplate security policy documents from the SANS Website, I needed to do a global find and replace. I quickly learned that this is not a supported feature. After manually cutting and pasting, I noticed the font type and size were different. After spending 10 minutes trying to get the font right, I decided just to cut and paste into MS Word. Even this operation was less than perfect, resulting in some of the formatting being lost.
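Since the global find and replace isn't supported, one workaround would have been to script the substitution locally before uploading. Here is a minimal sketch; the folder name, file extension, and placeholder string are hypothetical stand-ins for whatever boilerplate the SANS templates actually use.

from pathlib import Path

# Minimal local find-and-replace sketch; the path and placeholder are assumptions.
PLACEHOLDER = "<Company Name>"
REPLACEMENT = "Our Church"

for path in Path("policies").glob("*.txt"):
    text = path.read_text(encoding="utf-8")
    if PLACEHOLDER in text:
        path.write_text(text.replace(PLACEHOLDER, REPLACEMENT), encoding="utf-8")
        print(f"Updated {path.name}")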

So after spending the better part of an hour, I have to conclude the experiment was less than a success. The concept is great; it is just that the implementation is less than perfect. It may work well as a central repository, but the features of the word processor are inadequate for productive work. I would have to give Google Docs a C-. One could say, what do you expect for free?

Sunday, August 26, 2007

More on Web 2.0...

There has been much discussion on what Web 2.0 is. After nearly a year and a half there is still substantial disagreement on what Web 2.0 really means. Nearly everyone will agree that Web 2.0 includes social networking sites, Wikipedia, blogs, and RSS feeds. Some will even go as far as to say that many of these services were included in Web 1.0 and that Web 2.0 is nothing but a marketing play.

On his Website, Tim O'Reilly listed the following comparisons in an attempt to clarify the meaning of Web 2.0.

Web 1.0 --> Web 2.0
DoubleClick --> Google AdSense
Ofoto --> Flickr
Akamai --> BitTorrent
mp3.com --> Napster
Britannica Online --> Wikipedia
personal websites --> blogging
evite --> upcoming.org and EVDB
domain name speculation --> search engine optimization
page views --> cost per click
screen scraping --> web services
publishing --> participation
content management systems --> wikis
directories (taxonomy) --> tagging ("folksonomy")
stickiness --> syndication

When thinking of the progression from Web 1.0 to 2.0, one of the first examples is that of Netscape and Google. Netscape is thought of as the standard bearer of Web 1.0, inasmuch as its IPO was one of the defining events of that era. The same can be said for Google and Web 2.0. Netscape framed Web 1.0 in the old software paradigm, relying heavily on a Web browser that was packaged as a desktop application. Google, on the other hand, has always been a native Web application that delivers services rather than packaged or sold software. Google and other Web 2.0 applications often took a radical approach in their design: BitTorrent made every client a server, and Wikipedia was based on the concept that any user could make an entry and that any entry could be edited by any user.

The power of Web 2.0 is the ability to extract useful information for a user from the terabytes of available data. Google does this through its extensive search language, which allows a user to navigate through all the data on the Internet to target a specific piece of information residing on an unknown server somewhere. This power has both a good side and a dark side, which will have to be brought under control: the same technology can be used to find credit card numbers, personal SSNs, and system administration passwords. A term has even been coined for this activity: Google hacking.

To summarize, some of the core competencies of Web 2.0 companies can be thought of as:
Services with ease of scalability
Services that get richer as more people use them
Services based on collective intelligence

http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html

Mark

Wednesday, August 8, 2007

DEFCON 15 - DAY ONE

This past week I was in Las Vegas attending DEFCON 15. The next few blogs will cover areas of interest I saw while I was there. The forum consisted of 5 tracks that ran from 10:00 AM to 9:00 PM. The amount of information provided is tremendous. For those not familiar with DEFCON, it is billed as the "Largest Underground Hacking Convention" and was held at the Riviera Hotel in Las Vegas. This year there were over 7000 hackers and security professionals attending the show. It is unique in that there is no pre-registration; in fact there is no registration at all. You show up in the hotel lobby with $100 cash and exchange the money for a badge. No questions, no lists, and no record of who attended. Due to the time I have, I will only be able to discuss a few of the presentations that I attended.

A new contest has been added to the venue... spot the undercover press official (a take-off on Spot the Fed). The story goes that a female reporter was trying to interview unsuspecting young hackers for a pending story. She was confronted and asked to wear press credentials. After she refused, the question was posed to the attendees: what should we do with the undercover press official? After discounting the cries for lynching, it was decided to ask her to leave the conference. The last that was seen of her, she was running for the door (see the YouTube videos below).

http://www.youtube.com/watch?v=2kbar4ahK7M

http://www.youtube.com/watch?v=3n2cBSBIAP0

For a more serious comment on day one, I will talk about Bruce Schneier's Q&A session. No speech from Bruce would be complete without his ragging on the TSA (not that they don't deserve it). The point of his TSA talk was that it is possible to fly on commercial airlines without showing a government ID, regardless of what you have been told. To prove the point, he traveled to DEFCON in Las Vegas without any identification.

He was asked about cryptographic algorithms and whether he considered SHA-1 still secure enough to use. What he said was that even weak algorithms are hard, if not impossible, to break; the problem is with the passwords or the use of a limited key space. He went on to say that the FBI doesn't try to break encryption; they just install a key logger, or send the hard drive to AccessData Corp and have them search for the password in some file or slack area of the drive. He believes that SHA-1 still has about 5 years of use before it must be replaced, and he thought there was plenty of time to develop the next generation of algorithms.
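To illustrate the key-space point with a toy example of my own (this is not from the talk): even a perfectly respectable hash like SHA-1 offers no protection when the secret itself is drawn from a tiny space.

import hashlib

# Toy illustration: a 4-digit PIN hashed with SHA-1 falls to brute force almost
# instantly because there are only 10,000 candidates; the algorithm isn't the weak link.
stored = hashlib.sha1(b"7351").hexdigest()  # pretend this hash was recovered from a drive

recovered = next(
    pin for pin in (f"{i:04d}" for i in range(10_000))
    if hashlib.sha1(pin.encode()).hexdigest() == stored
)
print(f"Recovered PIN: {recovered}")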

Next he was asked about data privacy. His thought was that it is really a little too late to be worrying about it. We all have so much sensitive data in the public domain that we have lost control of it. The other problem is that we don't own the data about us. When we go to Amazon.com, they collect information about what we buy, how we paid for it, and where we live. They are able to use this data, share it with their partners, or sell it. Until we get control of the data that is collected about us, we will not be able to protect our privacy. His belief is that society as a whole will have to decide how important this data is and force controls to be put in place to protect it. With Web 2.0 and social Web sites this problem will only become worse.

More breaking news on simple hacks for office doors…

Hacking an office door reader

http://www.youtube.com/watch?v=z7oPn7V5mHg

12-year-old girl bumps a lock at DEFCON 15

http://www.youtube.com/watch?v=D1LH7lrftKA


Tuesday, August 7, 2007

Predicting a Digital Pearl Harbor

This spring we saw the first example of what an information warfare attack against a country might look like. Estonia came under a loosely organized attack from Russian Websites beginning on April 27th. The action appeared to be in response to the removal of a Russian monument. The attacks appeared to be limited to Internet infrastructure targets and were simple in design. Most were denial-of-service (DoS) attacks, with postings on Russian sites explaining how to craft DoS scripts and which IP addresses to target. The campaign played on the nationalist spirit of the average Russian, framing the attacks on Estonia under the banner of attacking "Estonian Fascists". I am not attempting to enter into a political discussion between Estonia and Russia, but to list some of the facts that have been reported about the incident. Since Estonia was not attacked directly by another government, this incident is being referred to as a cyber riot.

The incident we saw in Estonia demonstrated the effectiveness of Internet attacks against another country and was one of the first documented cases of an attack at this level. The trends show that a full-scale Digital Pearl Harbor attack is not only possible, it is a question of when. What was lacking in this attack was the sophistication required to attack the integrity of systems and make changes; this attack was only able to deny access to key systems.

A more recent large-scale attack was launched against sites in Italy. The attacks were based on a Russian-made exploit kit called MPack, which provides the attackers with a management console showing which exploits are effective and which domains are susceptible. While this attack was believed to be gang related, it does demonstrate the level of sophistication that has been developed. If this type of tool were deployed across a botnet farm, the results could be devastating. "Operation Bot Roast", recently conducted by the FBI, uncovered a botnet containing over a million computers; the Italian attack was based on only about 6000 compromised computers.

Although the ability to launch a denial-of-service attack against another country has been demonstrated, it will be a number of years before a sophisticated attack can be coordinated against critical infrastructure sites with enough impact to cripple a country, for example targeting Wall Street to corrupt the trading platform, or attacking the control systems of nuclear power plants.

Based on past studies, I predict that within the next 10 to 15 years we will see a full-scale, sophisticated attack against another country as an alternative to conventional military action. The USAF has recently added cyberspace to its mission and is charged with developing surgical attack tools along with protecting US national Internet assets.


Links for additional information:

http://www.nytimes.com/2007/05/29/technology/29estonia.html?ex=1338091200&en=80d1358e48b6740b&ei=5088&partner=rssnyt&emc=rss

http://www.guardian.co.uk/technology/2003/feb/20/security.onlinesupplement

http://www.dickdestiny.com/blog/2007/06/electronic-pearl-harbor-for-day-idiot.html

http://www.computerworlduk.com/management/security/cybercrime/news/index.cfm?newsid=3561

http://www.computerworlduk.com/management/security/cybercrime/news/index.cfm?newsid=3514

http://www.msisac.org/webcast/05_05/info/05_05presentation.swf

Wednesday, August 1, 2007

Failed Prediction: The flying car:

One area I have been asked to comment on is why predictions fail. I recently came across an interesting Wikipedia page that discussed Ford's flying car. I thought this was an interesting example of a case where the technology was there, but other outside forces prevented the prediction from becoming a reality.

The Ford Motor Company performed a feasibility study in the 1950s and concluded that both the technology and the market supported a flying car. They saw interest from the public service sector, such as emergency response, police, and the military, as well as from luxury transportation. The concept may have been fine, but the supporting infrastructure wasn't there. When Ford approached the FAA, it was determined that the means for air traffic control were inadequate for the number of flying cars Ford was proposing.

In 1937 Waldo Waterman had produced the Waterman Aerobile, powered by a Studebaker engine, which could fly at 112 MPH and travel on the ground at 56 MPH. The technology existed, but the government wasn't ready for every home to have a plane in its garage.

What has been developed in this market space is lightweight helicopters for niche markets. What was lost along the way was a low-cost model that could be called the "people's" Aerobile. It could be argued that the prediction was 70-80 years ahead of its time. Maybe in the next twenty years we will see an affordable flying car capable of both ground and air transportation. Maybe when we master gravity we will see a host of not just flying cars, but flying personal transports.

http://en.wikipedia.org/wiki/Flying_car

Sunday, July 29, 2007

Second Life

We all know that Second Life is now being used by universities and other groups to provide an online training environment. But I recently ran across an article that described how an MIT researcher is using a Wii controller within Second Life to create training simulations for power plants, pesticide applicators, and medical device manufacturers. So how much longer will it be before you log on to Second Life to go to the office instead of driving there in your car?

http://www.wired.com/gadgets/miscellaneous/news/2007/07/wiimote

Mark

Wednesday, July 25, 2007

Web 2.0

I was recently reading about Web 2.0 and its direction when I came across an article that listed six predictions. Reading them, what they were saying could be said about most new Web approaches.

· Open source will be a driving factor
· Workers will drive the trend within their companies
· If the industry doesn't drive standards, the enterprise will
· People who don’t get it will be left behind
· Web 2.0 is an attitude that will extend past this current trend
· Software will get simpler and more online

http://www.thealarmclock.com/euro/archives/2007/07/six_predictions_for.html

Saturday, July 21, 2007

Introduction

I am a doctoral student doing research on IT security metrics. The main discussion thread for this blog will deal with future trends and predictions about the Internet.