
Sunday, October 9, 2011

Not Invented Here Syndrome, why reinvent the wheel?


From my point of view, Not Invented Here Syndrome is both a great anti-pattern and a way to be very productive in the short term.

When you develop an application without using third-party software (libraries, platforms or APIs), there is nothing to stop you from delivering the SW except your own inability to design and implement it. I'm not talking about the fact that developers lack significant experience with the third-party SW and spend time, money and effort learning it and getting used to it. At least learning and investigating new SW is not a waste of time; it's always new experience. There are also pieces of software that might be considered exceptions, like Spring (which helps with everything and is very mature and stable) or some MVC and UI technologies (you will hardly work with the servlet API directly when building a web application).

I'm talking about software that you could either substitute easily (implement it from scratch), or software that contains a few features and abilities you need while the rest of it you'll probably never use. If you find yourself facing this question and you decide to use such software, there are two possible outcomes, depending on the quality of its open-source community, because you will most probably need to make modifications to it, deliver a good patch and get others to apply it.

  1.    You hit a road block. You come across serious bugs and idiosyncrasies, and bending the SW is not possible because nobody listens, or it is a remote API whose sources you don't have. Or, after further experience, it suddenly turns out not to be what you expected. What now? Weeks of work have been done, you can't take such a huge step back, refactoring would be awful. You are stuck.
  2.    The open-source community is good; the software is promising, with good architecture and devoted developers and project leads who respond to what you have to say. Defining a good community is hard: in a huge, active community with thousands of users and developers, nobody will listen to you any time soon unless you make a significant effort, while in a small but active community of mostly developers you get immediate responses and deal with everything very quickly.

Not using open-source SW and reinventing the wheel is the anti-pattern, because instead of improving and testing an existing universal, generic piece of software, you build a new custom one that suits only your needs.
   On the other hand, you can't hit those road blocks: you don't need anything from anybody, and you won't have to wait days or weeks for a response.
   Big companies like Google, IBM, Red Hat etc. usually have Not Invented Here Syndrome. I guess they have to, for various reasons. The wheel is being reinvented on a daily basis. I'm very grateful that there are hundreds of specifications and standards for Java development that reduce the consequences of this.

Let me exemplify this with two pieces of software:

Google APIs are great; you can think of hundreds of use cases for them. A fancy AtomPub-based GData protocol, tons of client libraries, what not. But after all, they are very buggy, at least from my perspective. Right now I'm using the Google Translator Toolkit API, which has a bug so tremendous you can hardly believe it. The funny thing is that GTT is in the "Labs" stage and the API is restricted and might be shut down in December 2011. Anyway, the bug is that API results do not correspond to what you see in Google Translator Toolkit. Two documents are listed; the API resolves one or none. You delete a document; GTT still lists it whereas the API does not. It's as if nobody is testing or using it whatsoever, because otherwise they would have seen that. Now what? I barely get any response from Google and its engineers to concrete API issue reports. Road block.

Liferay might be another example. Liferay is a sophisticated and mature piece of software, but think hard about what you expect and need from it! Do you want it because of some of its plugins? You need groups and roles, and you'd have to implement a permission system on resources, but Liferay already has that implemented, so why would you? You go for Liferay and you get an incredible platform, but tons of road blocks as well. So you'd better be sure that you really need a full-fledged portal or a professional web content management system.

Friday, October 7, 2011

Java 8 backwards compatibility

I really hope that Oracle JVM engineers will decide to ignore backwards compatibility for the Java 8 release. It has been a great burden on the language in past releases, and on developers too. The language is, and always was, overly complicated, and honoring backwards compatibility is totally counterproductive.

For instance, consider generics, introduced in Java 5 exactly 7 years ago. They came with type erasure for the sake of backwards compatibility with Java 1.4. The problem is that the JVM engineers didn't have free hands and came up with quite a complicated solution that even smart people often don't fully get at some point. Type erasure wipes out all dynamic type information that you could otherwise use at runtime, especially in some data-binding libraries, for instance. All static type information in a class declaration is available via reflection:

    public class Test extends Superclass<String> {
        // String retained, accessible via Class.getGenericSuperclass()

        // Integer retained (via Field.getGenericType())
        private List<Integer> l;

        // Long retained (via Method.getGenericParameterTypes())
        public void test(List<Long> l) { }

        // Character retained (via Method.getGenericReturnType())
        public List<Character> test() { return null; }
    }

but dynamic type information is erased. If you create a class instance that carries dynamic type information T, there is no way to resolve it at runtime, speaking of JVM built-in means of course.

public class Feed<T extends Entry> {
        // T is erased at runtime; a Feed instance has no way to recover it
        public List<T> entries = new ArrayList<T>();
        Object o = new ArrayList<String>(); // String erased as well
}
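To make the erasure concrete, here is a quick check (a throwaway sketch of my own, not from any library) showing that all parameterizations collapse to one runtime class:

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<String>();
        List<Integer> ints = new ArrayList<Integer>();

        // Element types are gone at runtime: both objects share the exact same class
        System.out.println(strings.getClass() == ints.getClass()); // true
        System.out.println(strings.getClass().getName());          // java.util.ArrayList
    }
}
```

This is also why List&lt;String&gt;.class does not compile: there is simply no such runtime class to refer to.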

There are workarounds like super type tokens and type literals, utilized in Guice (TypeLiteral) or Jackson (TypeReference), where you create an anonymous class carrying static type information:

       // now the Type is obtainable
       TypeLiteral<List<String>> list = new TypeLiteral<List<String>>() { };

which is passed into a method instead of List&lt;String&gt;.class (which won't compile) and can be used to resolve the type via reflection later. This was practically one of the suggested new features for Java 7: enhancing java.lang.reflect.Type with generics the way it was done with java.lang.Class, so that List&lt;String&gt;.class would yield a Type&lt;List&lt;String&gt;&gt;. But it didn't make it through. It would have been a "patch" anyway.
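A home-grown version of such a super type token can be sketched in a few lines (TypeToken is a made-up name here, standing in for Guice's TypeLiteral or Jackson's TypeReference):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

// The anonymous subclass created at the call site freezes T into its
// class file, where getGenericSuperclass() can read it back.
abstract class TypeToken<T> {
    private final Type type;

    protected TypeToken() {
        ParameterizedType superclass =
                (ParameterizedType) getClass().getGenericSuperclass();
        this.type = superclass.getActualTypeArguments()[0];
    }

    public Type getType() { return type; }
}

public class TokenDemo {
    public static void main(String[] args) {
        // the trailing { } matters: it creates an anonymous subclass
        TypeToken<List<String>> token = new TypeToken<List<String>>() { };
        System.out.println(token.getType()); // java.util.List<java.lang.String>
    }
}
```

The trick only works because the generic supertype of the anonymous class is static information, retained in its class file just like the signatures shown earlier.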
    A patch for design flaws caused by tolerating backwards compatibility across two releases a decade apart. Even after 7 years, the design decisions were highly influenced by the fact that Java 7 must be backwards compatible with Java 4. Why? If a company has an old Java 4 project, why would it run JVM 7 on the production server? It won't for at least a year anyway, even if it is said to be 100% backwards compatible. The company just shouldn't deploy Java 7 apps there, that's it. Why on earth is it necessary to impose language design restrictions because of that? Well, I must admit I understand these decisions a little in the case of Java 7, but I really hope for less backwards compatibility in Java 8. It's a simple question: do we provide companies with the comfort of running apps and libraries that are 10 years older or younger than each other, OR do we lighten the burden on developers' shoulders and make the language less complex and easier to use? I say f*** the companies... These attempts to make apps live longer than 10 years make no sense. If it is a framework or platform and it is successful, it will be refactored x times in the meantime. If it is banking software, f*** it, rewrite it using brand-new technologies for the same cost that would otherwise be spent on its maintenance.

I mean, if in 2013 I find myself upset about serious design flaws in Project Lambda that exist just because of backwards compatibility with Java 1.4, I will be really disappointed.

A lot is happening these days, and Oracle must pay very close attention to keep up. Look at the role of scripting technologies on both the client and server side. Google came up with its V8 JavaScript engine, which is used by Google Chrome and, thanks to its embeddable nature, can be used by anyone; Node.js is built on it. The very successful Google Web Toolkit for building enterprise UIs compiles the client-side part of apps into JavaScript. Oracle responds in this area with project Nashorn: the Nashorn JavaScript engine will be a successor to the Rhino engine, which serves mainly for server-side compilation of JS into bytecode, providing reflection and other nice things; the YUI Compressor uses it. And guess what: Google is introducing Google Dart with the goal of replacing JavaScript, providing even a cross-compiler able to compile Dart to ECMAScript 3 on the fly. Google has been encouraging other browsers to integrate a native Dart VM into their engines, which is now in progress in Google Chrome.
   All this shows how important and almost immortal Java and the JVM are, but also how they can be avoided in various areas and substituted by C/C++. The V8 engine would have been a really suitable candidate for the JVM, yet C++ was chosen for its implementation, maybe because it is used by Chrome. The future of JavaScript is threatened by Google Dart in the long run, while Oracle is working on a new JavaScript engine implemented from scratch. And so on...

Plenty of reasons for preferring platform evolution over backwards compatibility, aren't there?