Default methods

baftos
baftos Member Posts: 3,431
edited Apr 9, 2014 6:24AM in Java 8 Questions

I read the Java tutorial on default methods and I think I understand how interfaces may evolve without loss of backward compatibility. My question is rather about how we should design our interfaces from scratch. If and when should we define default methods? As an extreme example, why not design MyNewInterface with all default methods? If some operation makes sense, fine, stick the appropriate code in there, and if not, just do nothing, return null, return 0, return false. This would give the implementers of my interface quite some freedom in what to care about, depending on their application needs, and would make their apps less verbose (kind of like what the AWT Adapter classes achieve). I'm not looking for approval of this approach, but for some guidelines. Hope this makes sense.

Edit: And one more question from someone involved a lot with libraries for external clients. Does it make sense, in a new release of the library, to rewrite old interfaces and turn some methods into default methods where sensible code exists? This option was not available until now, and users sometimes had to write trivial implementations.

Answers

  • James_D
    James_D Member Posts: 1,496 Gold Trophy

    Here's my 2 cents on this. I should start by saying I've made this argument on a couple of forums recently and no-one else seems to agree with me, so you should probably read this with that borne in mind.

    Adding default methods to existing interfaces is not guaranteed to preserve binary compatibility. Suppose you add a default method (call it m) to an existing interface I. If there happens to be a class C already in existence that implements both I and an interface I2, and I2 has a default method with a signature identical to m, then the class C, which previously ran fine, will produce runtime errors when run against the new version of I. (The problem is that you introduce an ambiguity in the resolution of the implementation of the method m.)
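
    A minimal sketch of that scenario, reusing the names above (the method bodies and the exact error are my own illustration):

    // Interface I, version 1: no default method m() yet.
    public interface I { }

    // Independently maintained interface that already has a default m().
    public interface I2 {
        default String m() { return "from I2"; }
    }

    // Compiled against I version 1: unambiguous, m() comes from I2.
    public class C implements I, I2 { }

    // Interface I, version 2, later adds its own default:
    //     default String m() { return "from I"; }
    // If only I is recompiled, the already-compiled C inherits two competing
    // defaults for m(); invoking m() on a C then fails at run time (typically
    // an IncompatibleClassChangeError) instead of being caught at compile time.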

    Brian Goetz (who is far, far more experienced in library and language design than I will ever be) describes this in section 9 of Interface Evolution via Virtual Extension Methods. He argues that this is going to be a rare case, and "only" occurs when classes are compiled independently of each other.

    I'm not sure I can dismiss this so casually. Every Java application uses classes compiled independently of each other: the application classes themselves are compiled independently of the core library classes, and in addition most, if not all, enterprise applications use a plethora of third-party libraries that are all compiled independently: Spring, Hibernate, application server implementations of the Java EE specifications, etc.

    To make the problem clear: if Java 9 introduces a default method to an existing interface, it raises the possibility that an application that ran under Java 8 will not run under Java 9.

    Moreover, I'm not convinced the exact collision of method signatures needed to cause this issue is going to be as rare as it might appear. For example, default methods added to interfaces in the next release of Java are likely to be methods that many users have found to be "missing" from the core libraries. In those cases, many "in-house", and perhaps some third-party, libraries will already be in place filling those gaps. So I think this could be a real problem.

    We maintain a modest-sized in-house library for our work. My approach is going to be to add default methods to existing interfaces only as a last resort. I will likely also think quite hard before defining classes that implement multiple interfaces from independently maintained libraries (if both interfaces later introduce default methods with the same signature, my classes will break).

    As I say, smarter people than me don't share these concerns, but you might want to think a bit about this.

    James_D
  • baftos
    baftos Member Posts: 3,431

    Methinks what you describe is a collision between method signatures coming from two interfaces. Like you, I see this as a problem, but it was there before and it is still there now. I think C# has means to deal with this (by qualifying the method name with the interface name, but I am not sure at all). Default methods don't seem to do anything to fix this problem, so my question still stands: cool stuff for backward compatibility, but how should we design new interfaces?

  • James_D
    James_D Member Posts: 1,496 Gold Trophy
    baftos wrote:

    Like you, I see this as a problem, but it was there before and it is still there now.

    No, it wasn't there before.

    Before default methods, all methods in an interface had to be abstract. So if a (concrete) class implemented two or more interfaces with the same method signature, it had to define its own implementation of a method with that signature, which resolves the ambiguity.
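
    A quick sketch of the pre-default situation, with made-up names:

    public interface Named   { String getName(); }
    public interface Labeled { String getName(); }

    // Both interfaces declare the same abstract signature, so the concrete
    // class has to supply the single implementation itself; there is nothing
    // for the compiler or the JVM to be ambiguous about.
    public class Widget implements Named, Labeled {
        @Override
        public String getName() { return "widget"; }
    }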

    With default methods, a class can implement multiple interfaces that each declare a default method with the same signature. This is a compile error if both interfaces already have the default method when the class is compiled, but if the default method is introduced into one of the interfaces after the class has been compiled, it will break that existing class.

    Default methods genuinely introduce multiple inheritance of behavior into Java. This is known to cause problems in other languages, but a cursory reading of the literature surrounding this release implies that the Java team have done it in a way that somehow avoids all those problems. They haven't: arguably they have minimized them, but they have definitely introduced the problems associated with multiple inheritance of behavior into the language.

    Bottom line: introducing default methods to existing interfaces does not guarantee backward compatibility. Read the "small print" and you'll see this is the case. Treat it with caution: use it by all means, but make sure you understand the implications before you get into it.

    I'm kind of intrigued by where the Java team is going with this, because I'm certain they know they have significantly weakened the guarantees of backward compatibility that existed before. Perhaps they've internally decided a future release (Java 10) will not be back-compatible and are starting to play a little more loosely with language changes in light of that. That's speculation, but whatever's going on, this marks a bit of a change of philosophy for Java.

    James_D
  • James_D
    James_D Member Posts: 1,496 Gold Trophy
    edited Mar 27, 2014 10:43PM

    So:

    baftos wrote:

    cool stuff for backward compatibility

    Probably not as compatible as you think, but...

    baftos wrote:

    how should we design new interfaces?

    For *new* interfaces, including default methods is not going to break anything (at least, not immediately; see below). I think you should use them sparingly: if you genuinely have a case of "every object of this type should have this functionality", *and* there's an obvious default implementation, then that's probably a good case for a default method.


    So, consider a DAO interface. You'll commonly want to get an object via its id. The database stores the id as an int, and the corresponding class has an id field of type Integer. So it's fairly intuitive to define


    public interface DAO {
        public Entity getEntity(int id) ;
    }
    

    But... all (or the vast majority) of my use cases are in a web application, and of course the only data type HTTP knows about is String. So at some point I get a request parameter representing the id as a String, and I have to convert it to an int in order to get the entity from the DAO. In other words, I end up doing this:

    DAO dao ;
    
    String idFromRequest = request.getParameter("id");
    Entity entity = dao.getEntity(Integer.parseInt(idFromRequest));
    

    I end up really needing DAO to support a

    public Entity getEntity(String id);
    

    method, but do I really want to force every implementation to write it, just so it can call Integer.parseInt(...) and invoke the other method?

    I could, of course, define an AbstractDAO class that adds the method in, but then I lose the whole "programming to interfaces" paradigm.

    So that would (at least, on the surface) appear to be a nice use case for default methods.
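
    A sketch of what that could look like (Entity and the int-backed id are just the assumptions from above):

    public interface DAO {
        public Entity getEntity(int id);

        // Convenience overload as a default method: implementations still only
        // provide getEntity(int), while web-facing callers can pass the raw
        // request parameter straight through.
        default Entity getEntity(String id) {
            return getEntity(Integer.parseInt(id));
        }
    }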

    You should still worry a bit. Even if I put a default method into a new interface, any classes that implement that interface and some other interface are vulnerable to breaking if the other interface later introduces a default method with the same signature as my default method. So you might want to think about making those default method names quite specific.

    Default methods should be used sparingly. If you add dozens of them to your libraries, you increase the risk of problems caused later by classes not being able to resolve calls unambiguously.

    James_D
  • jwenting
    jwenting Member Posts: 4,864 Gold Badge
    edited Apr 1, 2014 5:15AM

    Even worse, "default methods", a.k.a. adding implementations to interfaces, are nothing more or less than introducing multiple inheritance at class level into Java through the back door.

    You can now simply create 2 interfaces, each with a "default method" with the same name...

    Mind, I haven't tested this (we're still on Java 6, with no plans to upgrade to even 7 for the time being, let alone 8, and I can't install JDKs other than those included with JDeveloper), but it looks scarily like ye olde diamond problem.

    public interface I1 {
        default String getX() {
            return "Hello World!";
        }
    }

    public interface I2 {
        default String getX() {
            return "I can't let you do that Dave...";
        }
    }

    // both interfaces contribute a default getX(), so this should not compile
    public class C implements I1, I2 {
        public static void main(String... args) {
            System.out.println(new C().getX());
        }
    }

    Hard enough to control in your own teams and internal libraries if you want to avoid it. When pulling in external libraries from multiple parties the chaos is complete.

    Mind, the compiler is supposed to catch this and refuse to compile it, but that doesn't resolve the problem that it is now quite possible to have such clashes when pulling in things from multiple sources, clashes that couldn't exist in the past because there was no multiple inheritance at class level in Java.

  • James_D
    James_D Member Posts: 1,496 Gold Trophy

    The compiler will catch this if all are compiled together, or if you compile C.java against the class files from I1 and I2 after they both have the default method defined. It's "only" problematic if the default method is introduced into one of the interfaces after compilation of the other interface and the class. The docs acknowledge this if you dig deep enough.
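
    Worth adding (my own sketch, reusing jwenting's I1 and I2 above): when the clash is visible at compile time, a version of C that does compile can resolve it by overriding the method and, if it wants one of the inherited bodies, delegating to it explicitly with the InterfaceName.super syntax:

    public class C implements I1, I2 {
        @Override
        public String getX() {
            // explicitly pick I1's default; I2.super.getX() would pick the other
            return I1.super.getX();
        }

        public static void main(String... args) {
            System.out.println(new C().getX()); // prints "Hello World!"
        }
    }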

    I get why these were introduced. We actually use streams a lot for our internal work, and have been doing so for a while now (using the early-access releases). The ease of parallelization of our code has had a huge effect on what we are able to do. And the Streams API simply wouldn't have been viable without adding default methods to interfaces (possible, yes; viable, no). But I worry about the way these have been sold as "it's now completely safe to add functionality to interfaces", as though Oracle have somehow mysteriously solved all the problems inherent in multiple inheritance. These should come with a big red flag saying "This is not a back-compatible change: use with caution and sparingly".
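
    To make that concrete (a standard-library example, not code from this thread): the only reason existing collections could gain stream() and parallelStream() in Java 8 without breaking every Collection implementation in the wild is that those methods are themselves defaults on java.util.Collection.

    import java.util.Arrays;
    import java.util.List;

    public class StreamsDemo {
        public static void main(String[] args) {
            List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);

            // parallelStream() is a default method on Collection, which is how it
            // could be retrofitted onto thousands of pre-existing implementations.
            int sumOfSquares = numbers.parallelStream()
                                      .mapToInt(n -> n * n)
                                      .sum();

            System.out.println(sumOfSquares); // 55
        }
    }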

    Again, just my $0.02 worth.

  • jwenting
    jwenting Member Posts: 4,864 Gold Badge
    public interface I1 {
        default String getX() {
            return "Hello World!";
        }
    }

    public interface I2 {
        default String getY() {
            return "I can't let you do that Dave...";
        }
    }

    public class C1 implements I1 {}
    public class C2 implements I2 {}
    public class C3 implements I1, I2 {}

    public class C implements I1, I2 {
        public static void main(String... args) {
            C3 c = new C3();
            System.out.println(c.getX());
            System.out.println(c.getY());
            I1 a = (I1) c;
            // this generates a ClassCastException in the equivalent Java 6 code
            C1 x = (C1) a;
            System.out.println(x.getX());
        }
    }
    

    Not sure what this'd do when trying to run under Java 8. In Java 6, line 24 (the cast to C1) fails, but I'd not be surprised if in Java 8 it works...

  • James_D
    James_D Member Posts: 1,496 Gold Trophy

    I haven't tested it, but that won't run. The runtime type is still determined by the constructor that was called, so the cast to C1 still fails.

  • jwenting
    jwenting Member Posts: 4,864 Gold Badge

    Let's hope then that they haven't decided to turn interfaces into valid runtime types just because they now have implementations in them...

  • jwenting
    jwenting Member Posts: 4,864 Gold Badge
    baftos wrote:
    
    Default methods don't seem to do anything to fix this problem, so my question still stands: cool stuff for backward compatibility, but how should we design new interfaces?
    

    Were I to write code conventions for a team/project/company, they'd not change from what they were before: strict separation of definition and implementation.

    So no default methods and no constant definitions (which have been legal for quite some time) in interfaces.
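
    (A quick sketch of the kind of thing that convention rules out, with made-up names:)

    public interface AppConstants {
        // fields in an interface are implicitly public static final:
        // the "constant interface" pattern the convention forbids
        String APP_NAME = "MyApp";
        int MAX_RETRIES = 3;

        // and, post Java 8, the convention would forbid this too
        default void logStart() {
            System.out.println("Starting " + APP_NAME);
        }
    }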

    It helps that everything we do here has to be compiled to the Java 6 language level and class file format, or it won't work on our infrastructure.

This discussion has been closed.