Arguments against Hungarian Notation

Rob Wehrli plug-devel@lists.PLUG.phoenix.az.us
Fri May 3 10:35:02 2002


>
> At 07:30 AM 5/3/02 -0700, you wrote:
> >Okay, I *long* ago decided to give up on Hungarian Notation. (I think
> >only VB programmers seem to like it.) I now find myself in a shop with
> >a *lot* of previous VB programmers who are converting to Java. So I want
> >them to stop using Hungarian notation. So far the only argument against
> >it I could remember is that it violates OO abstraction. But I *know*
> >there are better arguments against it than that. Anyone have any sources
> >or know of any good reasons against?
> >
> >Carl Parrish
> >

What?!  Violates OO abstraction?  That is TOTAL BS...whether you like 
Hungarian or not...and, FYI, Hungarian notation was used in C long before 
VB was a reality.

You may want to reconsider why it is GOOD to use it...rather than "give 
up."

I just love those programmers who write the following:


#include <stdio.h>

int fa( int a, int b, float f, char * c, char ch )
{
  int *aa = &a;
  while( (ch=*c)!='\0' )
  {
    switch( ch )
    {
    case 'A':
    case 'a':
    case 'C':
    case 'c':
    case 'L':
    case 'l':
    case 'Z':
    case 'z':
    case 'T':
    case 't':
    case 'G':
    case 'g':
      (*aa+=b*f)/ch;
      break;
    case 'N':
    case 'n':
    case 'E':
    case 'e':
    case 'O':
    case 'o':
    case 'F':
    case 'f':
    case 'S':
    case 's':
    case 'B':
    case 'b':
      (((b*b)+*aa)/f)*ch;
      break;
    default:
      *aa = 147;
    }
    printf( "%d%d%f%s%c\n", a,b,f,c,ch );
    c++;
  }
  return *aa;
}

int main( int argc, char * argv[] )
{
  char *s = "The lights are on but no one is home!";
  float fb = 0.78F;

  return fa( 112, 4, fb, s, 'a' );
}



...of course, it is *never* _this_ straightforward!  After wading for 
hours through the various files trying to figure out what the hell 147 is 
supposed to mean...we end up trying to decipher what:

  printf( "%d%d%f%s%c\n", a,b,f,c,ch );

...is supposed to do...well, at least in printf we can see the formatting 
"types" and figure out what the argument types are...

BUT...the people who DON'T like Hungarian tend to be those who continuously 
cast away types or don't use them properly...for example:

typedef'ing everything to a void *.  Don't laugh.  I've seen it in droves! 
Now, only the code and the programmer can ensure that we're getting the 
right data...whatever we do, let's not use the compiler's ability to check!

Arguments for Hungarian far outweigh the lazy programmer's weak (and BOGUS) 
excuses like "violates abstraction!"  Please, pray tell, how does *this* 
violate abstraction?

class Rect
{
  public:
    Rect();
    Rect( int i_x, int i_y );
    Rect( const Rect &rhs );
    Rect & operator=( const Rect &rhs );
    bool operator==( const Rect &rhs );
    bool operator!=( const Rect &rhs );
    ~Rect();

    int GetX() const;
    int GetY() const;
    int SetX( int i_x );
    int SetY( int i_y );

  protected:
  private:
    int m_iX;
    int m_iY;
};

...abstraction is hiding the details of the implementation, not "obscuring 
the interface so that the variable types are hidden" ...in fact, once your 
"interface" is defined, and used, the "named variables" are only a 
convenience for the programmer to remember their types further into the 
code than where they are "declared."  This is especially helpful in *C* 
programming, where you must declare variables first and spaghetti your way 
into it from there.  In C++, it is actually *preferred* to declare 
variables where "first used."  That keeps them better scoped to their 
purpose and helps keep them fresh in the minds of programmers where they 
are being used.  Hungarian notation is simply a method of helping the 
programmer remember the TYPE of the variable.  As Alan says, any "style" of 
programming that is CONSISTENT is preferred over some of the tons of 
"other" crap out there.  Indenting and uses of braces, whitespace, etc..can 
all easily be "fixed" with code beautifiers.  Variable names are "stuck" 
with whatever happened to suite the whim of the programmer.

Some time ago I posted a message to someone about an OO Blackjack game...in 
it, I said something along the lines of the hardest thing to do is come up 
with good names for classes (and variables!).  Hungarian helps take the 
"guess work" out of types, even though most modern GUI editors can do that 
for you by hovering the mouse over it...whether under Windbloze or 
Linux/UNIX.  However, for those of us who don't want to go fetch the mouse, 
but remain "on the home row," Hungarian notation is an important and 
justifiable tool.  Once you get used to it and use it, every time you look 
at "unnamed" code, you'll cringe.  With Hungarian, you can quickly and 
easily figure out what another programmer is trying to do *regardless* of 
his religious preference for braces/whitespace/etc.

...of course, then, you'd have to *care* about type information...coming 
from the Ruby world, where types are "dynamic," I can see how that would 
be a difficult change...

One thing Alan said is certainly true: the negativity associated with a 
non-conformist/extremist attitude toward styles is at least politically 
important.  There are also enough "respected" programmers' opinions out 
there weighing in favor of it to make a dissenting opinion sound out of 
place.  I'm not trying to tell you to get in line and follow
the other lemmings off the cliff.  I would encourage you to write better 
code...and, using Hungarian (or some other variable naming convention that 
helps keep track of data types) could be helpful in obtaining that goal.

If you *really* think that a variable naming convention "violates 
abstraction" please...let's hear it.  There really isn't enough humor in 
code these days...totally preposterous!  As if the difference between 
naming an integer "int age" and "int iAge" were relevant to the interface 
constructs!  Didn't the interface DECLARE that it must have a type of int 
on its call?  The name is only useful as a unique (within the appropriate 
scope) symbol and as a visual reminder to us that we're passing some data 
that is in some way meaningful to what we're doing with the code.  The
interface could easily use "int x" ...in fact, there is no real difference 
between them, only that "x" is less meaningful than "age" or "iAge" to the 
READER of the code *if* some value, which is considered to be the "AGE" 
data is to be used.

Take Care.

Rob!