One of the popular ways to explain inheritance in object-oriented programming is that it represents an "is a" relationship. This can be true, but often it isn't.
An example that Robert C. Martin likes to use is that in the real world a square is a rectangle, yet it makes no sense to model that relationship with inheritance in an object-oriented system: a mutable Square cannot honour the contract of a Rectangle whose width and height vary independently.
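A quick sketch of why the square/rectangle pairing breaks down (the class and method names here are my own illustration, not from any particular codebase):

```typescript
class Rectangle {
  constructor(protected width: number, protected height: number) {}
  setWidth(w: number): void { this.width = w; }
  setHeight(h: number): void { this.height = h; }
  area(): number { return this.width * this.height; }
}

// Mathematically a square "is a" rectangle, so we inherit...
class Square extends Rectangle {
  constructor(side: number) { super(side, side); }
  // ...but keeping the square invariant forces each setter to change both sides.
  setWidth(w: number): void { this.width = w; this.height = w; }
  setHeight(h: number): void { this.width = h; this.height = h; }
}

// Any caller relying on Rectangle's contract is now broken:
function stretch(r: Rectangle): number {
  r.setWidth(5);
  r.setHeight(4);
  return r.area(); // a caller reasonably expects 5 * 4 = 20 here
}

stretch(new Rectangle(1, 1)); // 20, as expected
stretch(new Square(1));       // 16 — the Square silently violated the contract
```

The "is a" relationship holds in geometry, but the behavioural contract does not survive the subclassing.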
I came across another example the other day while working on one of my hobby projects. The project is an ORM, and I was adding basic convention support. You can specify your own conventions by implementing an IConvention interface. However, you often won't want to redefine every convention; instead you want to override only the ones you need to.
To enable this there is a DefaultConvention class. You can derive from this class and override just the conventions you need to modify. So inheritance is used as a means to solve a practical problem, but if you read that inheritance relationship as an "is a", it makes no sense.
An overriding convention is NOT a default convention. It is actually the opposite of that.
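A minimal sketch of the mechanism described above. The post doesn't show IConvention's actual members, so the methods here (tableName, columnName) and the overriding class are hypothetical stand-ins:

```typescript
// Hypothetical convention interface; the real IConvention's members
// are not shown in the post, so these are illustrative only.
interface IConvention {
  tableName(entityName: string): string;
  columnName(propertyName: string): string;
}

// Plausible out-of-the-box behaviour: pluralise table names,
// pass column names through unchanged.
class DefaultConvention implements IConvention {
  tableName(entityName: string): string { return entityName + "s"; }
  columnName(propertyName: string): string { return propertyName; }
}

// Derive from the defaults and override only the rule you disagree with.
class UpperCaseTableConvention extends DefaultConvention {
  tableName(entityName: string): string {
    return super.tableName(entityName).toUpperCase();
  }
}

const convention: IConvention = new UpperCaseTableConvention();
convention.tableName("user");  // "USERS" — overridden
convention.columnName("name"); // "name"  — inherited default
```

The derived class exists precisely to *disagree* with its base, which is why the "is a" reading falls apart: an UpperCaseTableConvention is not a DefaultConvention in any meaningful sense; it merely reuses its code.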
So, be careful when thinking about inheritance: treat it as a programming-specific mechanism for coupling classes rather than as a model of a real-world "is a" relationship.