Sunday, May 22, 2022

Miguel de Icaza: a compilation

I was always a fan of Miguel de Icaza. I never met him in person, but I studied the same degree he did at the Facultad de Ciencias (UNAM), so I always heard about people who had worked with him, who were his friends, and so on.


This is a compilation of articles about him that I have been putting together for a while:


Bio
Miguel de Icaza, co-founder and CTO of Ximian, is one of the foremost luminaries in the Linux development community and one of the world's most prominent advocates for Free Software. Before co-founding Ximian, Miguel was instrumental in the development of Linux for SPARC systems and the Midnight Commander file manager. Miguel is the founder and President of the GNOME Foundation. He was also the first recipient of the prestigious MIT Innovator of the Year award in 1999. Miguel has galvanized Ximian's efforts to make Linux accessible and available to the average computer user, and continues to reach out globally by working with international organizations such as eMexico to introduce affordable technology alternatives, like Linux, to other nations.


 

Stories published from this author

De Icaza Strikes Out: "Cringley Is All Wrong" About Microsoft

In a sharp rebuttal of a Robert X. Cringley column last week, subtitled "The Only Way to Beat Microsoft Is To Ignore Microsoft," in which he argued that paying too much attention to Microsoft simply allows Microsoft to define the game "and when Microsoft gets to define the game, they ALWAYS win," Miguel de Icaza declares: "A nice statement, but nothing more than a nice statement; other than that, it's all incorrect. Microsoft has won in the past due to many factors, and none of them related to 'Let them define the game'." He goes on to give examples.

Keynote Panel : The Open Source Debate

This lively discussion will take on the controversial open source questions that everyone is asking.

The Mono Project

In July 2001 we announced the launch of the Mono Project - an effort to build an open-source implementation of the Microsoft .NET Framework using the technical documents that Microsoft submitted to ECMA (the European Computer Manufacturers Association) for standardization.

 

 


Interview with Miguel de Icaza, co-founder of Gnome, Ximian and Mono


Born in Mexico City, Miguel de Icaza was the driving force behind the creation of the Gnome free software desktop, and co-founded the open source company Ximian, bought last August by Novell. In July 2001, he helped start another ambitious project, Mono: a free implementation for GNU/Linux of Microsoft's .Net framework. He talks to Glyn Moody about Mono's progress, how Ximian was bought by Novell, and why he is so scared of Microsoft's Longhorn.

Q. How has your vision of Mono changed since you began the project, and what are the main aims of Mono today?

A. A lot of the things that Microsoft was addressing with .Net were touching on existing pain points for us. We've been using C and C++ way too much - they're nice, but they're very close to the machine and what we wanted was to empower regular users to build applications for Linux. Windows has a lot of tools that address a particular problem, but on Linux we're kind of on our own in terms of development. So when Microsoft came out with this [.Net] thing, initially what we saw was very interesting, and that's how the project got started. But as people got together and started to work and collaborate on this effort, a couple of things happened.

The first one is that there was more and more momentum behind building APIs that were compatible with the Microsoft ones. Novell and Ximian were focused just on the core and C#; a lot of the people who came and contributed software to the project were interested in Windows Forms, or ASP.Net or Web services or databases, which were part of the Microsoft stack.

And at the same time we have grown organically a stack completely independent of the Microsoft stack, which we call the Mono stack but it includes things like tools for doing GUI development for Linux - that was one thing that we were very interested in and we actually invested a lot of effort into that.

So today at the core we still have Mono, which is what we wanted to do, and now we've got two very healthy independent stacks: the Microsoft-compatible stack for people who want to bring their applications from Windows to Linux, and also this completely new and fresh stack of things that in some cases are portable from Linux to Windows, and in some cases are very, very Linux specific.

Q. Microsoft doesn't seem to be making so much noise about .Net these days: what's your view of .Net's progress at the moment: how is it shaping up as a platform for writing software?

A. I think that Microsoft overdid what .Net was. Initially, the message was extremely confusing because .Net meant too many things. It meant a set of server products, it meant a special edition of Windows, it meant a development platform. When we refer to .Net we call it the .Net framework, which is what it actually was - it was only one of the components of .Net. And the .Net framework is actually a fairly good development platform. At least it has stopped the migration of people who were sick and tired of MFC to Java. Now they have a much better solution if they go with the Microsoft-based technology and there's a nice migration path. So I think they've been successful.

You probably don't hear too much about .Net because now it's not hot news, it's just something that developers have. And the other reason might be because they're going through another rebranding exercise within Microsoft - I think it's going to be massively complex. They're trying to rebrand .Net framework into WinFX - not to be confused with WinFS. WinFX is basically the new version of the .Net framework with things for Longhorn, so it includes WinFS, it includes Avalon, it includes communications. The whole Longhorn thing is built on top of .Net.

Q. One big change since Mono started has been Novell's acquisition of Ximian; how did it come about?

A. We had a product for Linux called Red Carpet. It's something for maintaining the software and software updates on Linux machines on servers and clients. We had a fantastic GUI, we had a command line tool that allowed administrators to schedule things so it can be completely unattended, you could centrally manage things, roll out deployments, back out, undo. We basically had a very good product for Linux, and Novell has an equivalent software product for Windows. So we were talking about how can we marry these products, how can we benefit from each other.

A few of their guys came. They were really trying to see how to position Novell and how they could ship services for Linux; they had a long-term plan. One day they came and presented and they said, well, here's the situation, we really want to get into Linux. I think you guys can help us.

Q. What did you find attractive in becoming part of Novell?

A. The negotiations, as I said, began with Red Carpet, but the Linux desktop was something that also fascinated them and they also wanted to become a player in the desktop space - we don't want to be relegated to the server space only, the client is where all the action is happening. So [being acquired] allowed us to restaff and continue working on the desktop without having to focus continuously on this [Red Carpet] thing.

Q. Has Novell decided what form the desktop will take - will it be Gnome-based, or have other elements?

A. We cannot choose one desktop over the other - Gnome or KDE - because there are users for both code bases. What we're planning on doing is working with an organisation called Freedesktop.org. The idea there is they define protocols - things like desktop notification systems, a system-wide configuration engine, clipboard support, drag and drop, etc. We're pushing those into Gnome and KDE to unify those things, and the default desktop really is a combination of elements.

Gnome and KDE are basically the shells, but then there are higher-level applications like the office suite. We're making the decision it's going to be OpenOffice, the browser it's going to be Mozilla, the email client it's going to be Evolution, the IM client it's going to be Gaim. So we basically have to pick successful open source projects and put them together. There's a lot of work on integrating.

Q. You've drawn a distinction between a GNU/Linux desktop and an open source desktop: what did you mean by that?

A. In some cases we've been building tools that are specific to Linux for the desktop, and they only work on Linux, but I see two major projects that are wildly, wildly successful: Mozilla and OpenOffice, and those two programs are cross platform. So if we're going to build new applications that require a large time investment, like say movie editing - today that doesn't matter for the enterprise desktop, but eventually it will when we get closer to consumers - you really need to have a cross-platform story. We all love Linux, but it's also a fact that some people might not be able to migrate. But we can still have them reduce their TCO. You don't necessarily need Office, you don't necessarily need IE or you don't necessarily need Final Cut Pro etc. We should be thinking in terms of building cross-platform applications that will work fine on Windows and under MacOS as well.

Q. When does Novell think that the GNU/Linux desktop will be ready for the general user?

A. We're doing this exercise ourselves. I think that by October the whole company has to migrate to OpenOffice, and then I think it's by June next year we all migrate to Linux - you don't want to migrate 6,000 people both operating system and office suite in a single jump.

We have a lot of existing customers which are also considering Linux desktop migrations and rolling out some of these programs, so we're learning from them. For example, in Extremadura and Andalucía they've been going out for two years now with these deployments. They have 200,000 deployed seats and they're going towards 400,000 deployed seats by the end of the summer on pure Linux, Gnome, Mozilla, OpenOffice desktops. They're two very, very large deployments, probably the largest deployments of Linux desktops today. These are being used by kids, by grandmothers.

Q. What do you see as the greatest danger to the continuing adoption and progress of open source?

A. Microsoft realises today that Linux is competing for some of the green pastures that it's been enjoying for so long; I think that Longhorn is a big attempt to take back what they owned before. Longhorn has kind of a scary technology called Avalon, which, when combined with another technology called XAML, is fairly dangerous. And the reason is that they've made it so it's basically an HTML replacement. The advantage is it's probably as easy as writing HTML, so that means that anybody can produce this content with a text editor.

It's basically an HTML Next Generation. A lot more widgets, a lot more flexibility, a richer experience - way, way richer experience. You get basically the native client experience with Web-like deployments. So you develop these extremely rich applications but they can be deployed as easily as the Web is. It's just like going to a URL: you go to Google, and you get the Web page and it works. So it's the same deployment model but the user interface interaction is just fantastic.

Of course, the only drawback is that this new interaction is completely tied to .Net and WinFX. So we see that as a very big danger. A lot of people today cannot migrate to Linux or cannot migrate to Mozilla because a lot of their internal Web sites happen to use IE extensions. Now imagine a world where you can only use XAML.

It's massive - I'm so scared.

Using the ECMA Standards: An Interview with Miguel de Icaza

Dare Obasanjo

December 2001

Summary: In this interview, Miguel de Icaza, the founder of GNOME and Ximian, talks about UNIX components, Bonobo, Mono, and Microsoft .NET.

Dare Obasanjo: You have recently been in the press due to Ximian's announcement that it shall create an open source implementation of the Microsoft .NET development platform. Before the recent furor you've been notable for the work you've done with GNOME and Bonobo. Can you give a brief overview of your involvement in free software from your earlier projects up to Mono?

Miguel de Icaza: I have been working for the past four years on the GNOME project in various areas: organization of it, libraries and applications. Before that I used to work on the Linux kernel, I worked for a long time on the SPARC port, then on the software raid and some on the Linux/SGI effort. Before that I had written the Midnight Commander file manager.

Dare Obasanjo: In your Let's Make Unix Not Suck series you mention that UNIX development has long been hampered by a lack of code reuse. You specifically mention Brad Cox's concept of Software Integrated Circuits, where software is built primarily by combining reusable components, as a vision of how code reuse should occur. Many have countered your arguments by stating that UNIX is built on the concept of using reusable components to build programs by connecting the output of smaller programs with pipes. What are your opinions of this counter-argument?

Miguel de Icaza: Well, the paper addresses that question in detail. A 'pipe' is hardly a complete component system. It is a transport mechanism that is used with some well-known protocols (lines, characters, buffers) to process information. The protocol only has a flow of information.

Details are in the paper. [Dare—Check the section entitled "Unix Components: Small is Beautiful."]

Dare Obasanjo: Bonobo was your attempt to create a UNIX component architecture using CORBA as the underlying base. What are the reasons you have decided to focus on Mono instead?

Miguel de Icaza: The GNOME project goal was to bring missing technologies to Unix and make it competitive in the current market place for desktop applications. We also realized early on that language independence was important, and that is why GNOME APIs were coded using a standard that allowed the APIs to be easily wrapped for other languages. Our APIs are available to most programming languages on Unix (Perl, Python, Scheme, C++, Objective-C, Ada).

Later on we decided to use better methods for encapsulating our APIs, and we started to use CORBA to define interfaces to components. We complemented it with policy and a set of standard GNOME interfaces for easily creating reusable, language independent components, controls and compound documents. This technology is known as Bonobo. Interfaces to Bonobo exist for C, Perl, Python, and Java.

CORBA is good when you define coarse interfaces, and most Bonobo interfaces are coarse. The only problem is that Bonobo/CORBA interfaces are not good for small interfaces. For example, an XML parsing Bonobo/CORBA component would be inefficient compared to a C API.

I also wrote at some point:

My interest in .NET comes from the attempts that we have made before in the GNOME project to achieve some of the things .NET does:

·        APIs that are exposed to multiple languages

·        Cross-language integration

·        Contract/interface based programming

And on top of things, I always loved various things about Java. I just did not love the Java combo that you were supposed to give or take.

We tried APIs exposed to many languages by having a common object base (GtkObject) and then following an API contract and a format that would allow others to wrap the APIs easily for their programming language. We even have a Scheme-based definition of the API that is used to generate wrappers on the fly. This solution is sub-optimal for many reasons.

The cross-language integration we have been doing with CORBA - sort of like COM, but with an imposed marshalling penalty. It works pretty well for non-inProc components. But for inProc components the story is pretty bad: since there was no CORBA ABI that we could use, the result is so horrible that I have no words to describe it.

On top of this problem, we have a proliferation of libraries. Most of them follow our coding conventions pretty accurately. Every once in a while they either wouldn't or we would adopt a library written by someone else. This led to a mix of libraries that, although powerful in result, implement multiple programming models and sometimes different allocation and ownership policies, and after a while you are dealing with five different kinds of "ref/unref" behaviors (CORBA local references, CORBA object references on Unknown objects, reference counts on object wrappers), and this was turning into a gigantic mess.

We have of course been trying to fix all these issues, and things are looking better (the GNOME 2.x platform does solve many of these issues, but still).

.NET seemed to me like an upgrade for Win32 developers: they had the same problems we had when dealing with APIs that have been designed over many years, a great deal of inconsistency. So I want to have some of this new "fresh air" available for building my own applications.

Dare Obasanjo: Bonobo is slightly based on COM and OLE2, as can be gleaned from the fact that Bonobo interfaces are all based on the Bonobo::Unknown interface, which provides two basic services - object lifetime management and object functionality discovery - and only contains three methods:

   module Bonobo {
      interface Unknown {
         void ref ();                                 // increment the reference count
         void unref ();                               // decrement the reference count
         Object query_interface (in string repoid);   // ask for another interface by repository id
      };
   };

which is very similar to Microsoft's COM IUnknown interface, which has the following methods:

HRESULT QueryInterface(REFIID riid, void **ppvObject);  // ask for another interface by IID
ULONG AddRef();                                          // increment the reference count
ULONG Release();                                         // decrement the reference count

Does the fact that .NET seems to imply that the end of COM is near mean that Mono will spell the end of Bonobo? Similarly, considering that .NET plans to have semi-transparent COM/.NET interoperability, is there a similar plan for Mono and Bonobo?

Miguel de Icaza: Definitely. Mono will have to interoperate with a number of systems out there including Bonobo on GNOME.
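For comparison with the Bonobo::Unknown and IUnknown methods quoted above, here is a minimal C# sketch of how the CLR covers those same two services. It is only an illustration; the IPrintable interface and Document class are hypothetical and not part of Bonobo or the .NET class library:

   using System;

   // Hypothetical interface and component, for illustration only.
   interface IPrintable { void Print(); }

   class Document : IPrintable
   {
       public void Print() { Console.WriteLine("printing..."); }
   }

   class Demo
   {
       static void Main()
       {
           object component = new Document();

           // Interface discovery: a runtime-checked cast plays the role of
           // query_interface() / QueryInterface().
           IPrintable printable = component as IPrintable;
           if (printable != null)
               printable.Print();

           // Lifetime management: the garbage collector takes the place of
           // ref()/unref() and AddRef()/Release(); there is nothing to call.
       }
   }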

Dare Obasanjo: A number of parties have claimed that the Microsoft .NET platform is a poor clone of the Java™ platform. If this is the case, why hasn't Ximian decided to clone or use the Java platform instead of cloning the Microsoft .NET platform?

Miguel de Icaza: We were interested in the CLR because it solves a problem that we face every day. The Java VM did not solve this problem.

Dare Obasanjo: On the Mono Rationale page it is pointed out that the Microsoft .NET strategy encompasses many efforts including:

·        The .NET development platform, a new platform for writing software

·        Web services

·        Microsoft Server Applications

·        New tools that use the new development platform

·        Hailstorm, the Microsoft .NET Passport-centralized single sign-on system that is being integrated into Microsoft Windows XP.

And you point out that Mono is merely an implementation of the .NET development platform. Is there any plan by Ximian to implement other parts of the .NET strategy?

Miguel de Icaza: Not at this point. We have a commitment to develop currently:

·        A CLI run time with a JITer for x86 CPUs

·        A C# compiler

·        A class library

All of the above with the help of external contributors. You have to understand that this is a big undertaking and that without the various people who have donated their time, expertise and code to the project we would not even have a chance of delivering a complete product any time soon.

We are doing this for selfish reasons: we want a better way of developing Linux and Unix applications ourselves and we see the CLI as such a thing.

That being said, Ximian being in the services and support business would not mind extending its effort towards making the Mono project tackle other things like porting to new platforms, or improving the JIT engine, or focusing on a particular area of Mono.

But other than this, we do not have plans at this point to go beyond the three basic announcements that we have made.

Dare Obasanjo: There are a number of other projects implementing other parts of .NET on free platforms that seem to have friction with the Mono project. Section 7.2 of the Portable.NET FAQ seems to indicate they have had conflict with the Mono project, as does the banning of Martin Coxall from the dotGNU mailing list. What are your thoughts on this?

Miguel de Icaza: I did not pay attention to the actual details of the banning of Martin from the DotGNU mailing lists. Usenet and Internet mailing lists are a culture of their own and I think this is just another instance of what usually happens on the Internet. It is definitely sad.

The focus of Mono and Portable.NET is slightly different: we are writing as much as we can in a high-level language like C#, and building reusable pieces of software out of it. Portable.NET is being written in C.

Dare Obasanjo: There have been conflicting reports about Ximian's relationship with Microsoft. On one hand there are reports that seem to indicate that there may be licensing problems between the license that will govern .NET and the GPL. On the other hand there is an indication that some within Microsoft are enthusiastic about Mono. So exactly what is Ximian's current relationship with Microsoft and what will be done to ensure that Mono does not violate Microsoft's licenses on .NET if they turn out to be restrictive?

Miguel de Icaza: Well, for one we are writing everything from scratch.

We are trying to stay on the safe side regarding patents. That means that we implement things in a way that has been used in the past, and we are not doing tremendously elaborate or efficient things in Mono yet. We are still very far from that; for now we are just using existing technologies and techniques.

Dare Obasanjo: It has been pointed out that Sun retracted Java from standards processes at least twice, will the Mono project continue if .NET stops being an open standard for any reason?

Miguel de Icaza: The upgrade on our development platform has a value independently of whether it is a standard or not. The fact that Microsoft has submitted its specifications to a standards body has helped, since people who know about these problems have looked at the problem and can pinpoint problems for interoperability.

Dare Obasanjo: Similarly what happens if Dan Kusnetzky's prediction comes true and Microsoft changes the .NET APIs in the future? Will the Mono project play catch up or will it become an incompatible implementation of .NET on UNIX platforms?

Miguel de Icaza: Microsoft is remarkably good at keeping their APIs backwards compatible (and this is one of the reasons I think they have had so much success as a platform vendor). So I think that this would not be a problem.

Now, even if this was a problem, it is always possible to have multiple implementations of the same APIs and use the correct one by choosing at run time the proper "assembly". Assemblies are a new way of dealing with software bundles and the files that are part of an assembly can be cryptographically checksummed and their APIs programmatically tested for compatibility.   [Dare—See the description of assemblies from the .NET Framework Glossary.]

So even if they deviate from the initial release, it would be possible to provide assemblies that are backwards compatible (we can both do that: Microsoft and ourselves).
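As a rough, hedged sketch of what picking "the proper assembly" at run time can look like, the real System.Reflection.Assembly.Load API accepts a strong name that pins a specific version; the library name, version and public key token below are made up for the example:

   using System;
   using System.Reflection;

   class LoadDemo
   {
       static void Main()
       {
           try
           {
               // Ask the runtime for a specific, strongly named build of a library.
               // The display name (name, version, public key token) is hypothetical.
               Assembly compat = Assembly.Load(
                   "SomeLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef");

               // Strongly named assemblies are cryptographically signed, so the
               // runtime can verify that the loaded files have not been altered.
               Console.WriteLine(compat.FullName);
           }
           catch (Exception e)
           {
               // The hypothetical library is not installed on this machine.
               Console.WriteLine("Could not load assembly: " + e.Message);
           }
       }
   }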

Dare Obasanjo: Looking at the Mono class status page I noticed that a large number of .NET class libraries are not being implemented in Mono such as Windows Forms, ADO.NET, Web services, XML schemas, reflection and a number of others. This means that it is very likely that when Mono and .NET are finally released, applications written for .NET will not be portable to Mono. Is there any plan to rectify this in the future or is creating a portable .NET platform not a goal of the Mono project? Similarly what are the short and long term goals of the Mono project?

Miguel de Icaza: The status Web page reflects the classes that people have "requested" to work on. The status Web page is just a way of saying, "Hey, I am working on this class as of this date" to avoid code duplication. If someone registers their interest in working on something and they do not do something after some period of time, then we can reclaim the class.

We are in the very early stages of the project, so you do see more work going into the foundational classes than into the end-user classes.

I was not even expecting so many great and talented programmers to contribute so early in the project. My original prediction was that we would spend the first three months hacking on our own in public with no external contributions, but I have been proved wrong.

You have to realize that the goals of the Mono project are not only the goals of Ximian. Ximian has a set of goals, but every contributor to the project has his own goals: some people want to learn, some people like working on C#, some people want full .NET compatibility on Linux, some people want language independence, some people like to optimize code, some people like low-level programming and some people want to compete with Microsoft, some people like the way .NET services work.

So the direction of the project is steered by those that contribute to it. Many people are very interested in having a compatible .NET implementation for non-Windows platforms, and they are contributing towards filling those gaps.

Dare Obasanjo: How does Ximian plan to pay for the costs of developing Mono, especially after the failure of a number of recent venture-funded, free software-based companies like Indrema, Eazel, and Great Bridge, and the fact that a sizable percentage of the remaining free software-based companies are on the ropes? Specifically, how does Ximian plan to make money at free software in general and Mono in particular?

Miguel de Icaza: Ximian provides support and services. We announced a few of our services recently, and more products and services have been in the pipeline for quite a while and will be announced during the next six months.

Those we announced recently are:

·        Red Carpet Express: a subscription service for those who want reliable, high-speed access to the Red Carpet servers.

·        Red Carpet Corporate Connect: We modified our Red Carpet updater technology to help people manage networks of Linux workstations easily and to deploy and maintain custom software packages.

·        Support and services for the GNOME desktop and Evolution: Our latest boxed products are our way of selling support services for the various products we ship.

We have also been providing professional services and support for people integrating free software-based solutions.

The particular case of Mono is interesting. We are working on Mono to reduce our development costs. A very nice foundation has been laid out and submitted to ECMA. Now, with the help of other interested parties who also realize its power, we are developing the Mono run time and development tools to help us improve our productivity.

Indeed, the team working on Mono at Ximian is the same team that provided infrastructural help to the rest of the company in the past.

Dare Obasanjo: It is probably little known in some corners that you once interviewed with Microsoft to work on the SPARC port of Internet Explorer. Considering the impact you have had on the free software community since then, have you ever wondered what your life would have been like if you had become a Microsoft employee?

Miguel de Icaza: I have not given it a lot of thought, no. But I did ask everyone I interviewed at Microsoft to open source Internet Explorer, way before Netscape Communicator was open-sourced.

Dare Obasanjo is a senior at the Georgia Institute of Technology working towards his Bachelor of Science degree in computer science. He spends his free time posting to online forums like Slashdot, Kuro5hin and Advogato, as well as writing various articles on programming and software. He has interned for various companies including Radiant Systems, i2 Technologies and Microsoft, and is currently debating the merits of a graduate degree but will most likely end up in Redmond when his time at GA Tech is over.

De Icaza Strikes Out: "Cringley Is All Wrong" About Microsoft

April 26, 2004

Summary
In a sharp rebuttal of a Robert X. Cringley column last week, subtitled "The Only Way to Beat Microsoft Is To Ignore Microsoft," in which he argued that paying too much attention to Microsoft simply allows Microsoft to define the game "and when Microsoft gets to define the game, they ALWAYS win," Miguel de Icaza declares: "A nice statement, but nothing more than a nice statement; other than that, it's all incorrect. Microsoft has won in the past due to many factors, and none of them related to 'Let them define the game'."
He goes on to give examples.

By Miguel de Icaza

Microsoft has won in the past due to many factors, and none of them related to "Let them define the game." Here are a couple from a list of many:

  • They leveraged their monopoly to break into new markets. The most discussed case is when they used brute force and anti-competitive strategies to get their products into new markets, but in some other cases they got fairly good adoption of their products with little or no effort, just by bundling them with Windows: MSN Messenger, Media Player.
  • Competitors were outmaneuvered or were incompetent (see High Stakes, No Prisoners).
  • People were sleeping at the wheel.

In 1993-1994, Linux had the promise of becoming the best desktop system. We had real multi-tasking, real 32-bit OS. Client and Server in the same system: Linux could be used as a server (file sharing, web serving), we could run DOS applications with dosemu. We had X11: could run applications remotely on a large server, and display on small machine. Linux quickly became a vibrant innovative community, and with virtual-desktops in our window managers, we could do things ten times as fast as Windows users! TeX was of course `much better than Windows, since it focuses on the content and the logical layout' and for those who did not like that, there was always the "Andrew" word processor. Tcl/Tk was as good as building apps with QuickBasic.

And then Microsoft released Windows 95.

  • A few years later, everyone is talking components: Netscape is putting IIOP on their client and server (ahead of their time, this later became popular as web services on the browser); Xerox ILU; Bonobo; KParts; the Borland-sponsored event to build a small component system that everyone agrees with; language bindings are at their peak.

The consensus at that time? Whatever Microsoft is doing is just a thin layer on top of COM/DCOM/Windows DNA which to most of us means `same old, same old, we are innovating!'.

And then Microsoft comes up with .NET.

Does something like XAML matter? Not really. But it makes it simple to create relatively cute apps, by relatively newbie users, in the same way anyone could build web pages with HTML.

Does Avalon really matter? It's a cute toolkit, with tons of widgetry, but nothing that we can't do on a weekend, right?

Does the fact that it's built on top of .NET matter? Well, you could argue it has some productivity advantages and security features, and get into a long discussion of .NET vs Java, but that's beside the point.

Everyone is arguing about tiny bits of the equation: `We have done that with Glade before!', `Gtk/Qt are cross-platform!', `We can get the same with good language bindings!', `We already have the widgets!', `Cairo is all we need', `What do users really want?' and of course `Don't let them define the game!'.

They are all fine points of view, but what makes Longhorn dangerous for the viability of Linux on the desktop is that the combination of Microsoft deployment power, XAML, Avalon and .NET is killer. It is what Java wanted to do with the Web, but with the channel to deploy it and the lessons learned from Java mistakes.

The combination means that Longhorn apps get the web-like deployment benefits: develop centrally, deploy centrally, and safely access any content with your browser.

The sandboxed execution in .NET [1] means that you can visit any web site and run local rich applications, as opposed to web applications, without fearing for your data security: spyware, trojans and what have you.

Avalon also means that these new "Web" applications can visually integrate with your OS, use native dialogs, and use the functionality in the OS (like the local contact picker).

And building fat-clients is arguably easier than building good looking, integrated, secure web applications (notice: applications, not static web pages).

And finally, Longhorn will get deployed, XAML/Avalon applications will be written, and people will consume them. The worst bit: people will expect their desktop to be able to access these "rich" sites. With 90% market share, it seems doable.

Will Avalon only run on Longhorn? Maybe. But do not count on that. Microsoft built IE4 for Windows 98, and later backported it to Windows 95, Windows 3.11 and moved it to HP-UX and Solaris.

The reason people are genuinely concerned and are discussing these issues is because they do not want to be caught sleeping at the wheel again.

Will this be the end of the world for Linux and the Mac? Not likely, many of us will continue using our regular applications, and enjoy our nicely usable and consistent desktops, but it will leave us out of some markets (just like it does today).

Btw, the Mozilla folks realized this already.

[1] Although it was easy to see why .NET supported the Code Access Security (CAS) in .NET 1.0, there was no real use for it. With Longhorn/Avalon/XAML it becomes obvious why it was implemented.
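To make the footnote concrete, here is a small, hedged C# sketch of Code Access Security using the real FileIOPermission API: before touching a file, code demands a permission and the runtime walks the call stack, refusing if any caller (for example, code downloaded from an untrusted site) has not been granted it. The file path is just an example:

   using System;
   using System.Security;
   using System.Security.Permissions;

   class CasDemo
   {
       static void Main()
       {
           // Example path; any resource protected by a CAS permission works the same way.
           FileIOPermission readData =
               new FileIOPermission(FileIOPermissionAccess.Read, @"C:\data\report.txt");

           try
           {
               // Every assembly on the call stack must have been granted this
               // permission; sandboxed (e.g. downloaded) code typically has not.
               readData.Demand();
               Console.WriteLine("Read access granted.");
           }
           catch (SecurityException)
           {
               Console.WriteLine("Read access denied by Code Access Security.");
           }
       }
   }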

Planning for the future

Although some of the discussion has centered around using a native toolkit like Gtk+/XUL to build a competitor that would have ISV sex-appeal, this is not a good foundation, as it won't give us Web-like deployment (we need a stack that can be secured to run untrusted applications, and we need to be able to verify the code that is downloaded, which leaves us with Java or .NET).

The time is short: Microsoft will ship Avalon in 2-3 years, and they already have a preview of the technology out.

I see two possible options:

  • Implement Avalon/XAML and ship it with Linux (with Mono).
  • Come up with our own, competitive stack.

I think someone will eventually implement Avalon (with or without the assistance of the Mono team); it's just something that developers enjoy doing.

If we choose to go in our own direction, there are certain strengths in open source that we should employ to get to market quickly: requirements, design guidelines, key people who could contribute, compatibility requirements and deployment platforms.

We have been referring internally at Novell to the latter approach as the Salvador platform (after a long debate about whether it should be called MiggyLon or Natalon).

We do not know if and when we would staff such an effort, but it's on the radar.

The patent issue

Are there patents in Avalon? It is likely that Microsoft will try to get some patents on it, but so far there are few or no new inventions in Avalon:

  • Canvas graphics, persistent objects (Tk Canvas, Gnome Canvas)
  • With AA/vector-based graphics (Gnome AA Canvas)
  • With animation (Nautilus Andy-branch Canvas items)
  • With vector graphics (Gnome Print, librsvg)
  • With a 2D graphics model (PDF, Libart, Cairo)
  • With Web-like-deployment security (SecureTcl, Tcl Plugin, Java)
  • XAML has been done before (Glade, Qt Designer), but I am not sure that XAML is the best "authoring" model yet. The power lies in their deployment power.



Linus Torvalds (torvalds@transmeta.com)

The homepage of a WWW-illiterate


Why does this exist at all?

Frankly, I don't know. I got a default homepage (in Finnish) made automatically for me, and now I wonder what I should do with it. If you have any great suggestions, feel free to mail me, and I'll probably feel free to ignore you.

If you're looking for Linux information, you'll find more of it somewhere else, because I'm hopeless when it comes to documentation. You could try Lasu's Linux Index for starters.

Linus v2.0

This is Patricia Miranda Torvalds, my daughter:

[Images: Gif of Patricia - Yet Another Gif of Patricia]

Linux Logo

See here for the penguin logos I like. I'm including an animated version here too: Blinking Penguin.

Contact Information

Snail-mail (work):

Linus Torvalds

Transmeta Corp

3940 Freedom Circle

Santa Clara, CA 95054

USA

Phone (work):

+1 (408) 327 9830 x328

Email:

torvalds@transmeta.com

Update on Windows NT Server vs. Linux Performance and Capabilities
Posted: May 12, 1999

 

The Mindcraft Report
Mindcraft recently published a study comparing the performance capabilities of Linux to Windows NT Server. The Linux community has voiced their opinions regarding the findings. The top Linux creators, including Linus Torvalds, Alan Cox, Jeremy Allison, and others have publicly stated their belief that the Mindcraft results don't accurately represent the performance capabilities of Linux. However, tests in PC Week and PC Magazine corroborate the Mindcraft findings. The Linux community has asked Mindcraft:

§  To configure and tune the servers themselves.

§  To be present to ensure that the tests are conducted fairly.

§  To run the tests at a neutral location.

§  For a third party to witness and audit the results.

§  To run the tests on lower-end hardware.

§  To run the tests using clients running Windows NT Workstation, in addition to Windows 9x clients.

Mindcraft Delivers on the Demands of the Linux Community
To address the concerns raised by the Linux community and to prove the tests were run fairly and without bias, Mindcraft has offered to rerun the tests, encouraging the participation of Microsoft and the top Linux developers. As stated in Mindcraft's Open Benchmark, the following requests of the Linux community have been met:

§  All tests will be redone allowing the top Linux folks to configure and tune the server.

§  Folks from Microsoft and the Linux community can be present for ALL testing.

§  The tests will be done at PC Week Labs, a Ziff-Davis publication.

§  PC Week will moderate and audit the testing.

§  Mindcraft has added a "low-end" configuration (single processor, 256MB).

§  Clients running Windows NT Workstation, in addition to Windows 9x, will be used for all tests.

Linux Community Slow to Respond
Even though their requests have been met, the Linux community has not officially accepted Mindcraft's offer. Furthermore, RedHat CEO Robert Young's request "to bring a real benchmarking organization to this enterprise, like Ziff-Davis, or the Meta group," has also been met. It's time for the Linux folks to step up to the challenge and prove that Linux is capable of achieving better results than Windows NT Server. After all, this is the real issue.

Windows NT Server Performance Track Record
Unlike Linux, there is an abundance of evidence to support the performance and scalability capabilities of Windows NT Server. To date there is no performance data using industry benchmarks to support the Linux community's claim that Linux performs better than Windows NT Server on server hardware.

TPC-C

Purpose:
§  Most widely recognized industry benchmark for measuring database (online transaction processing) performance.
§  Reports transactions per minute (tpmC).
§  Reports overall cost per transaction ($/tpmC).

Windows NT Server results:
§  Top 10 TPC-C price/performance solutions.
§  Top results on single, dual, and quad processor servers.
§  Results from most major Windows NT Server database vendors.

Linux results:
§  None - Linux and Linux database vendors have yet to post even one TPC-C result.

SPECWeb

Purpose:
§  Widely recognized industry benchmark for measuring the static performance capabilities of a Web server.

Windows NT Server results:
§  Best dual and quad processor results.
§  Results from most major server hardware vendors.

Linux results:
§  None - Linux has yet to post SPECWeb results.

NetBench

Purpose:
§  NetBench is a portable benchmark program that measures how well a file server handles file I/O requests from 32-bit Windows clients, which pelt the server with requests for network file operations.
§  NetBench reports throughput in megabits per second (Mbps) and client response time measurements.

Windows NT Server results:
§  PC Week - 330 Mbps (69% faster than Linux)
§  Mindcraft - 286 Mbps (151% faster than Linux)

Linux results:
§  PC Week - 195 Mbps
§  Mindcraft - 114 Mbps

WebBench

Purpose:
§  WebBench measures Web server performance. Unlike SPECWeb, it can measure static, dynamic, static/dynamic mix, and e-commerce (SSL) workloads.
§  WebBench reports the number of client requests per second.

Windows NT Server results:
§  PC Week (static) - 4000 requests/sec (90% faster than Linux)
§  PC Magazine (static) - 3000 requests/sec (233% faster than Linux)
§  Mindcraft (static) - 3770 requests/sec (277% faster than Linux)
§  PC Magazine (static/dynamic mix) - 2250 requests/sec (650% faster than Linux)
§  PC Magazine (e-commerce - SSL) - 1950 requests/sec (680% faster than Linux)

Linux results:
§  PC Week (static) - 2100 requests/sec
§  PC Magazine (static) - 900 requests/sec
§  Mindcraft (static) - 1000 requests/sec
§  PC Magazine (static/dynamic mix) - 300 requests/sec
§  PC Magazine (e-commerce - SSL) - 250 requests/sec

 

It's About More than Performance
Although performance is important when evaluating server operating systems, there are other equally or more important capabilities that should be considered.
These include:

Reliability - Guaranteed server uptime

Windows NT Server 4.0:
§  A number of OEMs offering 99.9% uptime guarantees on Windows NT Server 4.0 [1]
§  Support for high availability application clustering and TCP/IP-based load balancing
§  Journaling file system for file-level reliability and recoverability

Linux:
§  No OEM guarantees uptime on Linux systems
§  Lack of an enterprise clustering system for service and application availability
§  Lack of extensive testing to guarantee compatibility across components and applications
§  Lack of a journaling file system - file system may not recover after unplanned downtime

Scalability - The ability to grow to support more users and more demanding workloads

Windows NT Server 4.0:
§  Supports 4GB of RAM by default (2GB kernel and 2GB user/application) - up to 3GB available for memory-intensive applications such as databases
§  NTFS provides a 64-bit file system capable of file sizes up to 2^64 bytes (much larger than 2GB)
§  Integrated file cache for faster access to commonly used files
§  Asynchronous I/O - threads can process other tasks while waiting on I/O, thus improving performance and scalability

Linux:
§  Limited to 2GB of physical memory
§  Limited to a maximum file size of 2GB
§  Synchronous I/O - introduces I/O contention, thus limiting SMP scalability
§  Lack of a kernel-level threading model for more efficient application processing

Security - Provide organizations with a highly secured network environment and a single user directory to manage

Windows NT Server 4.0:
§  Single, secure sign-on across the multiple servers in a networked environment
§  System services run in a secure context, providing higher levels of security for multi-user services

Linux:
§  Inherits the security flaws of UNIX (i.e., easy to gain root access via poorly written applications)
§  No resolution path (e.g., methodology) for bug fixes with clear accountability
§  No centralized security - users must manually synchronize user accounts across servers
§  More prone to security bugs [2]

Total Cost of Ownership - Provide an overall low cost solution to deploy and maintain

Windows NT Server 4.0:
§  Overall, 37% less expensive to set up and operate than UNIX
§  26% less expensive to set up and integrate than UNIX
§  27% less expensive to administer than UNIX

Linux:
§  Inherits the high setup, integration, and maintenance costs associated with setting up and managing a UNIX environment
§  Low degree of integration increases costs and technical risk

Application Availability - Provide a wide range of OS-integrated applications to reduce the cost of deploying and managing business solutions

Windows NT Server 4.0:
§  Over 8,000 Windows NT compatible applications available
§  Over 4,000 server-based applications
§  650 carry the Designed for BackOffice® logo, offering directory and security integration
§  Extensive internal and external beta testing to ensure binary compatibility across services and applications

Linux:
§  Hundreds of available applications
§  No certification process for applications
§  No commitment to binary backward compatibility
§  Historically, in order to perform optimally, applications need to be recompiled when the OS is upgraded

Hardware Support - Runs on a wide range of hardware and provides optimized drivers

Windows NT Server 4.0:
§  Support for the latest hardware innovations
§  Support for 24K devices - 15K with logo
§  Driver Development Kit to assist hardware vendors in developing device drivers

Linux:
§  Limited hardware driver support
§  Not optimized for high-end servers

Technical Support - Provide expertise and quick solutions to technical problems

Windows NT Server 4.0:
§  Dedicated support network
§  350K Microsoft Trained Professionals
§  160K Microsoft Certified Engineers
§  Support through partners and OEMs

Linux:
§  "Peer-to-peer" support, gaining some momentum with industry hardware OEMs (Compaq, IBM, etc.)
§  No formalized field training

Ease of Use - Reduce the time it takes to learn, set up and manage the OS to make it available to a greater number of users

Windows NT Server 4.0:
§  Integrated platform built around ease of use
§  GUI-based tools
§  Wizards to simplify complicated tasks
§  Scriptable administration for automated local and remote management

Linux:
§  Needs highly trained system administrators - usually requiring developer-level skills
§  Administrators are required to re-link and reload the kernel to add features to the OS
§  Most configuration settings require editing of text-based files
§  When available, no commonality between GUI-based tools

Integration of system services and applications to reduce complexity and management costs

Windows NT Server 4.0:
§  OS services and applications designed to integrate and work together
§  Integrated security across OS services and applications
§  Common management and application services across client and server

Linux:
§  OS services provided as an un-integrated collection of technologies developed by independent developers
§  Open questions about internationalization and access by people with disabilities
§  End users forced to integrate components themselves (i.e., Web server, database, application authentication)

Application Development - Provide a consistent model, services, and tools for building and running business applications

Windows NT Server 4.0:
§  Integrated component model and server application services
§  Web application server framework
§  Integrated message-queuing services and transaction-processing services
§  Broad language support, including Java
§  Database interoperability with distributed transaction support (DTC)

Linux:
§  Provides source code to allow developers to deviate from the standard distribution
§  Typical UNIX development - scripting together of C executables using Perl and other scripting languages
§  No application framework for developing distributed or Web-based applications
§  Poor support for Java

Road Map - Allows customers to plan future deployments

Windows NT Server 4.0:
§  Clear long-term roadmap based on a customer-focused vision
§  Over $2 billion in R&D spending by Microsoft against the roadmap, combined with even greater investments by ISVs and OEMs to evolve the platform

Linux:
§  No long-term roadmap - features get added based on the coding interests of the OSS development community as well as their willingness to implement them

Internationalized - Available in different language versions

Windows NT Server 4.0:
§  Localized in 14 languages
§  Deep Unicode support throughout the system

Linux:
§  Mixed support - some components are internationalized, some not
§  No formal program to deliver internationalized versions

[1] Current announced vendors include HP, Compaq, Data General, and IBM.
[2] See www.lwn.net

 

Conclusion
Although the Linux community is focusing on the messenger and not the message, Mindcraft has graciously agreed to rerun the tests at their own expense, accommodating the Linux community. Now it's time for the Linux community to demonstrate the real performance and scalability capabilities of Linux, or withdraw their criticisms of the initial Mindcraft report.

 


© 2003 Microsoft Corporation. All rights reserved.
 
 

Linux continues the industry revolution

Miguel de Icaza seeks to create, through software, greater equity in access to technology

Jorge Arredondo Pineda
EL UNIVERSAL
Monday, September 27, 2004

 

 


Ever since the Mexican developer Miguel de Icaza started his project developing Linux-based systems, his goal has been to create technology both for those who can pay for it and for those who cannot.

Even though his company Ximian was acquired by the US firm Novell, his goal is being met: "80% of the software we develop is for the people; they are projects available to anyone. The other 20% are proprietary programs that Novell sells and that allow the company to keep growing."

De Icaza is one of the most important Linux developers in the world. His two projects, Mono and GNOME, could revolutionize the information technology industry. The first builds a bridge so that projects written for Microsoft's .Net systems can run on open source systems, and the second offers an alternative to Windows on the desktop.

Thus, the now Vice President of Product Technology at Novell repeatedly addressed technology topics as well as political and economic issues, speaking out against neoliberal policies and politics in Mexico.

"In the e-Mexico project, Linux did not fail; we were not even given the chance to prove ourselves. When I spoke with Julio César Margain (former coordinator of the government program), they asked me how much we could contribute to the project; Microsoft had already made a contribution of around 45 million dollars, half in licenses and half in training," de Icaza explained.

He added that, while Microsoft's donation was significant, additional money still had to be paid for more licenses; so although Linux did not offer a cash contribution, it represented a smaller investment, and the savings were worth more than Microsoft's donation. Moreover, de Icaza continued, a Linux deployment stays with the local developer community and is not money that leaves the country.

The challenge is for users to find everything they need in Linux.

 


"Las corporaciones tienen un mandato, que es maximizar el regreso en la inversión, pero de cierta forma el software libre ha obligado a estas compañías a tomar un rol social en contra de su instinto de 'lock in', porque si pudieran hacer de Linux una propiedad suya lo harían". Miguel de Icaza

Por José Eseverri

Reforma

Toda revolución tiene sus íconos. Y el movimiento de software libre ha encontrado uno en Miguel de Icaza, pero no un héroe ni un mártir.

A sus 32 años, el hacker mexicano sigue promoviendo a Linux como alternativa a la "tiranía" del software propietario con la misma pasión de cuando era estudiante en la UNAM, sin importar que hoy ocupe un lugar clave en la estrategia de una corporación global.

Miguel de Icaza encarna como nadie la evolución que ha sufrido Linux en una década, de ser poco más que el juguete de un puñado de programadores idealistas que intercambiaban código salpicado de opiniones políticas en internet, hasta convertirse en una seria amenaza al dominio de Microsoft.

Como actual vicepresidente de plataformas de desarrollo en Novell, ha sido señalado por radicales del movimiento por ceder a los encantos del mundo corporativo, pero De Icaza tiene otra revolución en mente.

En entrevista con INTERFASE, el programador sugiere que Linux ha tomado por asalto a los grandes capitales en las tecnologías de información, obligando a las corporaciones a actuar contra sus instintos y tomar una mayor responsabilidad social.

La historia de Miguel va de la mano con la de Linux, desde que un estudiante finlandés, Linus Torvalds, hizo público el kernel o corazón del sistema operativo en 1991 y Richard Stallman, desde el MIT, le diera protección legal al acuñar la licencia GPL (Licencia Pública General) que permite que el código pueda ser libremente distribuido.

En el pasado, sin embargo, se necesitaba un considerable conocimiento de cómputo para configurar y utilizar Linux, por lo que sus usuarios caían bajo la etiqueta de "hacker" o "geek".

De Icaza se dio cuenta que si Linux iba a ser relevante tendría que parecerse a sistemas operativos con interfaz gráfica, como Windows o MacOS, para ser atractivo al usuario común, lo que dio origen al ambiente "GNOME".

Reconociendo que la ventaja no siempre la tienen los más fuertes sino los que son capaces de adaptarse a su entorno, Miguel ha inyectado a Linux un instinto de supervivencia del cual carecía.

Ese sentido evolutivo se ve reflejado en los proyectos que ha encabezado, desde GNOME y su primera compañía Helix Code (código hélice) hasta la curiosa obsesión por nombres de primates (Bonobo, Ximian) cuyo último eslabón es Mono, una plataforma abierta de desarrollo similar al ambiente .NET de Microsoft.

Si Mono tiene el éxito que el programador espera, el abismo de aplicaciones que separa a Linux de las necesidades del usuario común se acortaría, porque los programadores podrían desarrollar para ambos ambientes.

Pero sobre todo, al ser libre, ayudaría a cerrar la brecha entre potencias tecnológicas y países en desarrollo como México, asegura.

De Icaza parece tener dominado un acto de equilibrio entre su posición como directivo en Novell y su reputación de "hacker", en el sentido de programador virtuoso, en la comunidad de software libre.

"Yo no creo que estén peleados", dice, "efectivamente, no todo el software de Novell es libre, pero creo que es un 'hack' hermoso al capitalismo porque ahora para subsistir necesita al software libre".

"Seguramente hay un instinto de colaboración desarrollándose en las corporaciones de la misma manera en la que se desarrolla entre los individuos, lo que demuestra la idea dar- winiana que aquello que solamente está en el interés propio no funciona a la larga para la especie".

Aunque ahora Linux cuenta con el respaldo de gigantes como IBM, Oracle, Intel, Redhat y el propio Novell, lo que sigue haciendo posible su avance es la contribución de miles de programadores que dedican horas de su tiempo libre a mejorar el sistema operativo.

De acuerdo con De Icaza las motivaciones para contribuir son varias y no siempre tienen que ver con razones altruistas y tampoco con ambición de fama.

"Siempre se creyó que era la cuestión de la egolatría, pero cuando le preguntas a la gente por qué lo hace, encuentras todo tipo de explicaciones".

Un estudio realizado por el Boston Consulting Group encontró que la mayoría de los desarrolladores buscaban en Linux un estímulo intelectual, mientras que sólo un 11 por ciento dijo que lo motivaba el odio a Microsoft.

"Así que no todo mundo en el comunidad de software libre es un talibán. Yo creo que eso refleja las personalidades de cada quién".

Conéctate

Éstos son los proyectos donde participa Miguel de Icaza:

primates.ximian.com/~miguel www.mono-project.com

ESTA NOTA PUEDES ENCONTRARLA EN: http://www.reforma.com/edicionimpresa/notas/041018/interfase/550669.htm

Fecha de publicación: 2004-10-18 "

 https://www.osnews.com/story/1499/mini-interview-with-miguel-de-icaza/

 

 

