Dashboard, Konfabulator differences becoming clearer Mac pundit John Gruber has written an excellent overview of the current snafu concerning Apple's new Dashboard and Arlo Rose's Konfabulator...
04:52 PM | Permalink | Comments (0) | TrackBack (0)
This piece from SD Times, which I simply do not agree with, led me to the question: are you a "google" away from being "amazoned"? Here is the excerpt from SD Times that irked me so much:

Eric Newcomer, CTO of Iona Technologies PLC, argues that avoiding vendor lock-in is not the most important role played by standards. "We hear a lot about the importance of standards. And the standards argument usually centers on guarding against vendor lock-in, since lock-in can be an expensive prospect. You will even find that most vendors readily acknowledge this benefit. While I do not dispute that avoiding vendor lock-in is of some importance, I do argue that of far more significance is the role industry standards play in reducing the overall cost of developing software and increasing developer productivity, especially for enterprise applications. What's needed is a common way of programming to any language or operating system, and a common way of communicating between any two or more programs. Heterogeneous hardware, operating-system and software environments are the main problems that businesses have, and will continue to have into the foreseeable future."

The benefit of standards is to prevent lock-in, whether vendor or technology lock-in. There is a lot of hype around the Real-Time Enterprise vision, and most technology vendors (OpenLink included) have realization of this vision as part of their value proposition. Any enterprise that is locked into a technology or vendor is simply abdicating a timeless responsibility: attaining the enterprise agility levels espoused by the Real-Time Enterprise vision. The real cost of engaging any technology or vendor is its long-term impact on the customer's ability to respond to market inflections via existing and future IT infrastructure. A standards-based IT infrastructure enables a company to dispose of those components that impede its ability to sustain desired agility levels. Put differently, standards enable companies to assemble IT infrastructure from an increasingly heterogeneous pool of vendors. Thus, a company should be able to mix and match "best of class" IT infrastructure components in line with its enterprise agility goals - something that is only attainable via a commitment to standards-based infrastructure components in the first place. An enterprise cannot be locked into a database, operating system, programming language, or technology religion and expect to be agile. Failure to engage standards ultimately implies that you are a "google" away from being "amazoned" in your chosen marketplace.
03:59 PM | Permalink | Comments (0) | TrackBack (0)
Mac OS X Tiger is a copycat, claims developer While Apple is expecting Microsoft to mimic Mac OS X 10.4 Tiger, one developer says the new Mac operating system is the one doing the copying... [via MacMinute]
A more important aspect of this post is the changing face of the Web client/OS. What I mean by this is the gradual decomposition of the user-agent monolith (the browser, for instance) into composite services that will increasingly communicate with services over the net.
08:19 PM | Permalink | Comments (0) | TrackBack (0)
As Blogging Icons Mend Fences, GPLed Blog Tools Reap Buzz Blogging icons Dave Winer and Six Apart took steps late last week to defuse separate controversies illustrating how bloggers hooked on freebies can become a management challenge.
10:57 PM | Permalink | Comments (0) | TrackBack (0)
MOBILE USERS IN SINGAPORE MAKE NEW FRIENDS WITH BEDD
BEDD, a new community application that works phone-to-phone over wireless Bluetooth, without any server, is literally bringing people together. "It's very thrilling to see users' reactions just after they receive the software and start filling out their own profiles, and having so many potential matches start pouring into their phones. Even more exciting is the rush of contacting the other person for the first time," says Carlton. "The BEDD software is fun for the end user because it's carrying the most demanded content in the world - people content. Content about who you are and about what you want... content about you!" The fact that this is a revolutionary concept was further endorsed this week when Nokia's Series 60 Platform named BEDD "Application of the Week."
BEDD, as a result of creating a new mobile social medium, also gives service providers and handset manufacturers something to look forward to: user fee revenue, network traffic revenue, increased handset sales revenue, and ultimately the carrying, distribution, and propagation of third-party content - both mobile-to-mobile and fixed-point to mobile.
"The possibilities of finding, communicating and interacting with other people, and finding products, have now moved from the traditional medium of newspapers and the Internet to the mobile phone," says CTO Olle Bliding.
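The core idea in the release above is serverless, peer-to-peer discovery: each handset finds nearby people directly over Bluetooth rather than through a central service. The sketch below is not BEDD's implementation (which runs on Series 60 handsets); it is a rough desktop illustration of that discovery step, assuming the third-party PyBluez library is installed.

```python
# Minimal sketch of serverless, phone-to-phone style discovery over Bluetooth.
# Illustrative only -- not BEDD's actual software; uses the PyBluez library.
import bluetooth

# Scan for nearby Bluetooth devices; no central server is involved --
# each device finds its peers directly over the air.
nearby = bluetooth.discover_devices(duration=8, lookup_names=True)

for address, name in nearby:
    # In a BEDD-like application, this is where two devices would go on
    # to exchange profiles and look for matches.
    print(f"Found peer {name!r} at {address}")
```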
08:53 PM | Permalink | Comments (0) | TrackBack (0)
I was on a conference call with the Virtuoso product manager, discussing the opportunities WinFS and RDF storage provide for yet another attempt to get the broader industry / audience / potential customer base to simply "get it" re. the Virtuoso value proposition, when this post by Jon Udell passed by.
I can only describe this as "Blog Telepathy" :-)
On the Google PC, you wouldn't need third-party add-ons to index and search your local files, e-mail, and instant messages. It would just happen. The voracious spider wouldn't stop there, though. The next piece of low-hanging fruit would be the Web pages you visit. These too would be stored, indexed, and made searchable. More ambitiously, the spider would record all your screen activity along with the underlying event streams. Even more ambitiously, it would record phone conversations, convert speech to text, and index that text. Although speech-to-text is a notoriously imperfect art, even imperfect results can support useful search. [InfoWorld.com]

This column is a companion to another from a few weeks ago: Google's supercomputer. Meanwhile I've been working on a story about Longhorn, for which I had a long and extremely interesting interview with Quentin Clark, the director of program management for WinFS. I'd like to transcribe the whole thing to post along with the story, when it runs, but the upshot is that Microsoft is planning more and better integration between WinFS and XML -- both in terms of data definition and query -- than I'd previously heard, which is welcome news.

It seems clear, though, that whatever can be accomplished by means of what I've come to call "managed metadata," we'll always want that Google effect to be happening in parallel. When asked about the Semantic Web and RDF at InfoWorld's 2002 CTO Forum, Sergey Brin said:
Look, putting angle brackets around things is not a technology, by itself. I'd rather make progress by having computers understand what humans write, than by forcing humans to write in ways computers can understand.

From my perspective, this isn't an either/or choice. I'd rather make progress by having computers understand what people write and by helping people to write in ways that computers can understand. What's more, I'd like to construe "writing in ways that computers can understand" as a problem for which hybrid SQL/XML technology is a solution. When managed metadata exists, or can be acquired, purely relational query will be powerful. When metadata is implicitly present, for example in XML fragments, XPath and XQuery can leverage it. The combination of relational, XML, and free-text search is the best of all worlds. As I've mentioned before, by the way, Kingsley Idehen has been demonstrating this for several years.
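To make the "relational plus XML plus free-text" combination Jon describes a little more concrete, here is a minimal sketch assuming documents are stored in a plain SQLite table alongside extracted relational columns. The table, column, and element names are hypothetical, and this is ordinary SQLite plus ElementTree, not Virtuoso's or WinFS's actual API.

```python
# Rough sketch: combining relational, XML (XPath-style), and free-text query
# over one store. Names are hypothetical; LIKE stands in for full-text search.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, author TEXT, body TEXT)")
conn.execute(
    "INSERT INTO docs (author, body) VALUES (?, ?)",
    ("jon", "<doc><title>WinFS and XML</title><tag>longhorn</tag></doc>"),
)

# 1. Relational predicate (managed metadata) combined with a free-text predicate.
rows = conn.execute(
    "SELECT id, body FROM docs WHERE author = ? AND body LIKE ?",
    ("jon", "%WinFS%"),
).fetchall()

# 2. XPath-style query over the metadata carried implicitly in the XML itself.
for doc_id, body in rows:
    root = ET.fromstring(body)
    if root.findtext("./tag") == "longhorn":
        print(doc_id, root.findtext("./title"))
```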
04:09 PM | Permalink | Comments (0) | TrackBack (0)
By Kendall Grant Clark, XML.com

I now find myself with a wealth of choices, not only on OS X -- where I use a freewheeling mixture of Mozilla, Safari, and Firefox daily -- but also on Linux. Windows users who continue to use Internet Explorer fall into one of several camps: they either don't know or don't care about web standards and compliance thereto; or they can't tell the difference between a bloated piece of software and a quality piece of software; or they can tell, but that difference is of no importance to them. If they don't know or care about IE's many defects, why should I? Because there are so many of them! In an ideal world, technology standards would unify, rationalize, and perhaps create new markets. While this kind of positive benefit is sometimes achieved by bodies like ISO, W3C, and OASIS, in the real world, given Microsoft's total domination of desktop computing, the fact that its browser does not excel ends up being not only a pain for users but also a pain for developers and publishers.
http://www.xml.com/pub/a/2004/06/16/deviant.html
06:04 PM | Permalink | Comments (0) | TrackBack (0)
Some clarifications re. my comments in the recent interview with LinuxInsider:
Will .Net Developers Get Mono? Novell has released a new version of Mono -- an open-source implementation of Microsoft's .Net framework -- and some early adopters are already singing its praises. "Mono makes Novell extremely relevant now," said Kingsley Idehen, president and CEO of OpenLink, which has just released Virtuoso 3.5, a database-oriented middleware product that was built using Mono.
I meant making Novell relevant again to developers (I also believe Mono could eventually have the same positive effect on IBM, Apple, and Borland).
"Using Mono, we can take Web services developed for Windows and .Net, and pop them into Solaris and Linux," said Idehen.
Web Services interoperability across .NET and Java, or across Solaris, Windows, and Linux, doesn't require Mono (that is a feature of the Web Services paradigm itself).
What I was trying to communicate in the quote is that Mono now has an improved implementation of the .NET Web Services framework. The implementation is so close to .NET's that you can take a .NET Web Services assembly (typically generated via Visual Studio .NET) and, in the case of Virtuoso, drop it into a WebDAV folder, create a Virtual Directory, and then Virtuoso's Mono runtime hosting kicks in, as demonstrated here.
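To illustrate the platform-neutrality point above: a SOAP request is just XML over HTTP, so a client on any operating system can call a service regardless of whether it is hosted by .NET on Windows or by Mono on Linux or Solaris. The sketch below is generic and hedged; the endpoint URL, namespace, and operation name are hypothetical, and it is not OpenLink's or Mono's own tooling.

```python
# Minimal sketch: calling a (hypothetical) SOAP 1.1 web service endpoint from
# any platform, showing that interoperability is a property of the protocol,
# not of the runtime hosting the service.
import urllib.request

ENDPOINT = "http://example.com/services/EchoService.asmx"  # hypothetical
SOAP_ACTION = "http://example.com/Echo"                     # hypothetical

envelope = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <Echo xmlns="http://example.com/">
      <message>Hello from any OS</message>
    </Echo>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": SOAP_ACTION},
)

with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))  # raw SOAP response envelope
```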
03:29 PM | Permalink | Comments (0) | TrackBack (0)
A refreshing response from Dare, to an insightful post.
Jon Udell has started a series of blog posts about the pillars of Longhorn. So far he has written Questions about Longhorn, part 1: WinFS and Questions about Longhorn, part 2: WinFS and semantics which ask the key question "If the software industry and significant parts of Microsoft such as Office and Indigo have decided on XML as the data interchange format, why is the next generation file system for Windows basically an object oriented database instead of an XML-centric database?"
I'd be very interested in what the WinFS folks like Mike Deem would say in response to Jon if they read his blog. Personally, I worry less about how well WinFS supports XML and more about whether it will be fast, secure, and failure-resistant. After all, at worst WinFS will support XML as well as a regular file system does, which is good enough for me to locate and query documents with my favorite XML query language today. On the other hand, if WinFS doesn't perform well or shows the same good-idea-but-poorly-implemented nature of the Windows registry, then it'll be a non-starter or, much worse, a widely used but often cursed aspect of Windows development (just like the Windows registry).
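As a rough sketch of what "locate and query documents with my favorite XML query language today" can look like on a plain file system (no WinFS required), consider the following. The directory, file pattern, and element names are hypothetical.

```python
# Minimal sketch: locating XML documents on an ordinary file system and
# querying them with XPath-style expressions. Paths and element names are
# hypothetical; no special file system is required.
import glob
import xml.etree.ElementTree as ET

for path in glob.glob("C:/Documents/**/*.xml", recursive=True):
    try:
        tree = ET.parse(path)
    except ET.ParseError:
        continue  # skip files that aren't well-formed XML
    # Find every <contact> element that has an <email> child.
    for contact in tree.getroot().iterfind(".//contact[email]"):
        print(path, contact.findtext("name"), contact.findtext("email"))
```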
As Jon Udell points out, the core scenarios touted for encouraging the creation of WinFS (i.e., search and adding metadata to files) don't really need a solution as complex or as intrusive to the operating system as WinFS. The only justification for something as radical and complex as WinFS is if Windows application developers end up utilizing it to meet their needs. However, as an application developer on the Windows platform, I primarily worry about three major aspects of WinFS. The first is performance: I definitely think having a query language over an optimized store in the file system is all good, but I wouldn't use it if the performance wasn't up to snuff. Secondly, I worry about security: Longhorn evangelists like talking up what a wonderful world it would be if all my apps could share their data, but they ignore the fact that in reality this can lead to disasters. Having multiple applications share the same data store, where one badly written application can corrupt the entire store, is worrisome. This is the fundamental problem with the Windows registry and, to a lesser extent, the cause of DLL hell in Windows. The third thing I worry about is that the programming model will suck. An easy-to-use programming model often trumps almost any problem. Developers prefer building distributed applications using XML Web Services in .NET to the alternatives, even though in some cases this choice leads to lower performance. The same developers would rather store information in the registry than come up with a robust alternative on their own, because the programming model for the registry is fairly straightforward.
All things said, I think WinFS is an interesting idea. I'm still not sure it is a good idea, but it is definitely interesting. Then again, given that WinFS assimilated a very good idea and thus delayed it from shipping, I may just be a biased SOB.
PS: I just saw that Jeremy Mazner posted a followup to Jon Udell's post entitled Jon Udell questions the value and direction of WinFS where he wrote
XML formats with well-defined, licensed schemas are certainly a great step towards a world of open data interchange. But XML files alone don't make it easier for users to find, relate, and act on their information. Jon's contention is that full-text search over XML files is good enough, but is it really? I did a series of blog entries on WinFS scenarios back in February, and I don't think Jon's full-text search approach would really enable these things.
Jeremy mostly misses Jon's point, which is aptly reduced to a single question at the beginning of this post. Jon isn't comparing full-text search over random XML files on your file system to WinFS. He is asking why WinFS couldn't be based on XML instead of being an object-oriented database.
05:56 PM | Permalink | Comments (0) | TrackBack (0)
A Blog post for the ages, from Jon Udell. I expect to refer back to this post a number of times in the future, as I have the same concerns across related realms; for instance data access API usage and evolution.
Enjoy!
Questions about Longhorn, part 3: Avalon's enterprise mission
The slide shown at the right comes from a presentation entitled Windows client roadmap, given last month to the International .NET Association (INETA). When I see slides like this, I always want to change the word "How" to "Why" -- so, in this case, the question would become "Why do I have to pick between Windows Forms and Avalon?" Similarly, MSDN's Channel 9 ran a video clip of Joe Beda, from the Avalon team, entitled How should developers prepare for Longhorn/Avalon? that, at least for me, begs the question "Why should developers prepare for Longhorn/Avalon?"
I've been looking at decision trees like the one shown in this slide for more than a decade. It's always the same yellow-on-blue PowerPoint template, and always the same message: here's how to manage your investment in current Windows technologies while preparing to assimilate the new stuff. For platform junkies, the internal logic can be compelling. The INETA presentation shows, for example, how it'll be possible to use XAML to write WinForms apps that host combinations of WinForms and Avalon components, or to write Avalon apps that host either or both styles of component. Cool! But...huh? Listen to how Joe Beda frames the "rich vs. reach" debate:
Avalon will be supplanting WinForms, but WinForms is more reach than it is rich. It's the reach versus rich thing, and in some ways there's a spectrum. If you write an ASP.NET thing and deploy via the browser, that's really reach. If you write a WinForms app, you can go down to Win98, I believe. Avalon's going to be Longhorn only.

So developers are invited to classify degrees of reach -- not only with respect to the Web, but even within Windows -- and to code accordingly. What's more, they're invited to consider WinForms, the post-MFC (Microsoft Foundation Classes) GUI framework in the .NET Framework, as "reachier" than Avalon. That's true by definition since Avalon's not here yet, but bizarre given that mainstream Windows developers can't yet regard .NET as a ubiquitous foundation, even though many would like to.
Beda recommends that developers isolate business logic and data-intensive stuff from the visual stuff -- which is always smart, of course -- and goes on to sketch an incremental plan for retrofitting Avalon goodness into existing apps. He concludes:
Avalon, and Longhorn in general, is Microsoft's stake in the ground, saying that we believe power on your desktop, locally sitting there doing cool stuff, is here to stay. We're investing on the desktop, we think it's a good place to be, and we hope we're going to start a wave of excitement leveraging all these new technologies that we're building.

It's not every decade that the Windows presentation subsystem gets a complete overhaul. As a matter of fact, it's never happened before. Avalon will retire the hodge-podge of DLLs that began with 16-bit Windows, and were carried forward (with accretion) to XP and Server 2003. It will replace this whole edifice with a new one that aims to unify three formerly distinct modes: the document, the user interface, and audio-visual media. This is a great idea, and it's a big deal. If you're a developer writing a Windows application that needs to deliver maximum consumer appeal three or four years from now, this is a wave you won't want to miss. But if you're an enterprise that will have to buy or build such applications, deploy them, and manage them, you'll want to know things like:
How much fragmentation can my developers and users tolerate within the Windows platform, never mind across platforms?
Will I be able to remote the Avalon GUI using Terminal Services and Citrix?
Is there any way to invest in Avalon without stealing resources from the Web and mobile stuff that I still have to support?
Then again, why even bother to ask these questions? It's not enough to believe that the return of rich-client technology will deliver compelling business benefits. (Which, by the way, I think it will.) You'd also have to be shown that Microsoft's brand of rich-client technology will trump all the platform-neutral variations. Perhaps such a case can be made, but the concept demos shown so far don't do so convincingly. The Amazon demo at the Longhorn PDC (Professional Developers Conference) was indeed cool, but you can see similar stuff happening in Laszlo, Flex, and other RIA (rich Internet application) environments today. Not, admittedly, with the same 3D effects. But if enterprises are going to head down a path that entails more Windows lock-in, Microsoft will have to combat the perception that the 3D stuff is gratuitous eye candy, and show order-of-magnitude improvements in users' ability to absorb and interact with information-rich services.
05:48 PM | Permalink | Comments (0) | TrackBack (0)