cross-platform c++ classes?
Rob Wehrli
plug-devel@lists.PLUG.phoenix.az.us
Mon May 21 15:31:01 2001
Lucas Vogel wrote:
>
> Hi Rob!
>
> The GSD library is, in essence, a single object(as in .dll or .a) library
I'd avoid calling a file an object, at least in the context of an OO
discussion. If it is instantiated within a body of code, then an object
it is; else, !object. :)
> (written in C) with a list of a couple hundred function calls(API's?) that
> you use to do different things with. The documentation(which is very poor
> indeed) splits all of these APIs into the following libraries:
When you say "following libraries," do you mean .o files (compilation
units)? Or are there substantial numbers of .o files per Draw/Plot/etc.?
Why not just send me a pointer to the GSD package if it is Open Source?
I can take a look, run it through some autodoc tool, and see if the
documentation is at all meaningful.
>
> Draw
> Plot
> Message
> Utility
> Attribute
> Attribute Conversion
> Database
> Data View
> Field Data
> Hierarchy
> X Window Symbol Palette
> X Window Frequently Used Symbol Palette
> SputSget BE
> SputSget LE
> X Window Draw
> X Window Utility
> NT Draw
> NT Utility
> Schema
> Menu
> X Window Attribute MMI
> X Window FreeHand Toolbar
> DMA Conversion
> Geo
> Parse
> Overlay
> X Window Database MMI
> X PixMap
> Validate
>
> What I want to do is create a set of C++ classes that will wrap around these
There are at least a dozen ways to approach this "topic." Just about all
of them stink in some way or another, and certainly all are open to quite
a bit of opinionated debate regarding which is a "better" approach to
take. I'd probably write a "global" wrapper or "mapped wrapper" for the
entire thing, stubbing out all of the function calls in one really big
and ugly class and pointing the stubs at the C implementation, while
trying to decipher where the class relationships exist and need to be
implemented. That way, you can create a pointer to the global wrapper
and call its methods any time your "not-stubbed" implementation calls
fail. There are at least a million different ways you can cut off your
arm in doing this, but at least you'll be able to trap it when you do
:) (using try/catch blocks)
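
Just so we're talking about the same shape of thing, here's a rough
sketch of that "global wrapper" idea. I haven't seen the GSD headers, so
gsd_draw_line() and friends below are names I'm making up; swap in the
real prototypes:

#include <stdexcept>

/* Hypothetical legacy prototype -- in reality this comes from the GSD
   headers/library; the placeholder body is only here so the sketch
   compiles on its own. */
extern "C" int gsd_draw_line(int x1, int y1, int x2, int y2)
{
    return 0;  /* pretend success */
}

class GsdWrapper
{
public:
    // Thin stub: forward to the C implementation and turn error codes
    // into exceptions so callers can trap failures with try/catch.
    void DrawLine(int x1, int y1, int x2, int y2)
    {
        if (gsd_draw_line(x1, y1, x2, y2) != 0)
            throw std::runtime_error("gsd_draw_line failed");
    }
    // ...a couple hundred more stubs, regrouped into real classes as
    // the relationships between Draw/Plot/Schema/etc. become clear.
};

int main()
{
    GsdWrapper gsd;
    try {
        gsd.DrawLine(0, 0, 10, 10);
    } catch (const std::exception&) {
        // this is where you trap the "cut off your arm" cases
    }
    return 0;
}

The point is that every failure path funnels through one place you can
catch while you figure out where the real class boundaries belong.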
> api calls so that I have actual objects to handle the data. I initially was
Remember that an "object" is an instantiation. Sometimes, really large
objects are "bothersome" to instantiate.
> In a nutshell, the purpose of this library is to create a base set of c++
> classes that manipulate this GSD library to make it easier to understand and
> use. The library will hide the API calls to the GSD library and allow me to
> inherit from a set of classes into more specific applications, such as the
> COM wrappers for Windows, while still giving me a library to use should I
> try to use the library for something on Linux. To use your words, I want to
> add to its grace and beauty.
It sounds like a noble enough goal. The key is to implement those
classes that are needed to simplify the use and understanding of the
"legacy" library while calling into the legacy code as needed from the
new implementation. This is an area where a really sweet use of
interfaces can be extremely helpful, but it can also make it harder to
describe your new implementation to others. Such an implementation
serves wonderfully for your eventual COM implementation, but don't get
carried away too quickly. Remember the OO programmer's creed: build the
simplest thing that works. Naturally, follow the rules and leave blatant
"hack" shortcuts and obscured (remember the code sample I sent)
implementations alone, but build the simplest thing that works.
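
To make the interface idea a little more concrete, here's the sort of
thing I mean: a tiny abstract class that hides the GSD calls. IGsdPlotter
and its methods are invented for the example; yours should mirror
whatever slice of GSD you actually need:

// Abstract interface that hides the GSD calls from client code.
class IGsdPlotter
{
public:
    virtual ~IGsdPlotter() {}
    virtual void PlotPoint(double x, double y) = 0;
    virtual void Clear() = 0;
};

// The X/Linux implementation forwards to the legacy C functions...
class XGsdPlotter : public IGsdPlotter
{
public:
    virtual void PlotPoint(double x, double y)
    {
        /* call the legacy GSD "X Window Draw"/"Plot" functions here */
    }
    virtual void Clear()
    {
        /* call the legacy GSD erase/clear function here */
    }
};

...and later the NT/COM object implements the same interface, so client
code programs against IGsdPlotter and never touches the C API directly.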
>
>
> I am trying to take a library that is required for use and hard to
> understand and place a well-written, neatly-organized, well-documented,
> comprehensive set of classes on top of it. From that I want to create a set
This is very nearly a "too" altruistic goal. Remember GIGO? If what
you're building on is a "shaky" foundation of poorly documented code,
then perhaps a rewrite is a better choice? (Often impractical, huh?)
Steal the algorithms that make sense (giving the author credit, of
course), but don't build too much dependency on something that is
unmaintainable and poorly documented itself...where will people *have*
to look when your documentation doesn't answer their question...? It is
almost always harder to rewrite someone else's code than to create your
own code using theirs as a "plan." Just be careful that you don't get
bitten by the cut-n-paste bug and end up with something whose workings
you don't understand and that suddenly won't compile no matter what :)
> of well-written, neatly-organized, well-documented, comprehensive COM
> objects that inherit from those classes and create a very powerful and
> easy-to-use library for handling the GSD functionality.
I'd caution against thinking "library" for now. Think of a library as a
place where thousands of books are stored. Each book has some
interesting knowledge or "use" to give to its user(s). It is an
*archive* of information. An API is the "card catalog" of the library:
which books are located in which sections, what their "titles" are, and
so forth is more or less the API. What you'll end up with is possibly a
supplemental library with a dependency on the legacy library, or
possibly a standalone library exposing a new, or at least "superset,"
API. I say "superset" because it really should be a superset (inclusive
of all of the legacy API, for backward compatibility?), but it doesn't
necessarily have to be, especially if it is not a standalone library.

Start thinking about the API that you want to implement. What new
information do you want someone to "read" in your "book?" Worry about
packaging after you've got a better handle on what the content looks
like; otherwise it's kinda like picking a "bottle" first and then trying
to get a "ship" of content into it. While the practical implementation
and result of your work is likely to be a library or other executable
code storage mechanism, let's not start buying bookcases just yet. Let's
write a couple of books, then we'll start to see what we need in terms
of storage. What you'll probably find is that your storage mechanism
changes slightly as a result of the target platform, so it will have to
come a bit later somehow.
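
In other words, sketch the "book" first and leave the bookcase for
later. Something like the following (GsdDocument, LoadSchema, etc. are
invented names, just to show the shape) says nothing yet about whether
it ends up in a .a, a .so, a .dll, or behind a COM server:

// gsddocument.h -- the "book": the API you want people to read.
// Packaging (static lib, shared lib, DLL, COM server) comes later.
// Every name below is an invented placeholder, not the real GSD API.
class GsdDocument
{
public:
    GsdDocument() : m_handle(0) {}
    ~GsdDocument() {}

    // Would wrap the legacy Schema calls; placeholder body for now.
    bool LoadSchema(const char* /*path*/) { return true; }

    // Would wrap the legacy Validate calls; placeholder body for now.
    bool Validate() { return true; }

private:
    void* m_handle;  // opaque handle into the legacy library
};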
One of the things that I like to try to share with other programmers is
to build what you know you know. Don't start building too much of what
you know you don't know. The more you get involved with the project,
the more you're going to learn and understand about what you're doing
(whether a veteran or a newcomer!). It is a lot easier and probably a
lot better to build based on what you do know and as you discover more
of what you need to learn, build as you go...rather than trying to
figure it all out up front. Just as soon as you get it all figured out,
the world will have changed and thrown a wrench into it anyway and
caused you to waste 90 days *thinking* about something when you could
have built at least half of it by then, if not all of it by just doing
what you know and learning as you go. Sometimes, sure, shit happens and
you have to trash a bunch of work because, hey, "we didn't know!" But
the beauty of the "process" is that you will have learned SO MUCH MORE
by having gone through it than you would ever learn by "speculating" and
"what-iffing" it to death without getting in the trenches and just doing
what you knew at the time. And let me tell you, if you do have to throw
something away, writing it the second time but better is probably a 10x
or more speed advantage...because you already know what you want and
because you already know what doesn't work and, more importantly, WHY it
failed and what to avoid. The traditional "waterfall" approach and
"front-loaded planning game" are nearing obsolescence...at least in
terms of practicality! They still have a long way to go before they die
in the minds of millions of people around the world. If you need a
better analogous example, just look at the Internet. How fast has it
changed? Isn't it changing at roughly the "speed of light" as we speak?
How many other people "shared" today because of it, and how many people
were impacted because of it...whether for better or worse? The faster
our world changes, the more important it is that we can adapt and
overcome the challenges of keeping pace with that change. If you're so
inclined, read Extreme Programming Explained by Kent Beck. I went to the
first-ever "immersion training" on XP with Bob Martin, Martin Fowler,
Kent Beck, and a host of others...a couple of years ago.
Don't let anyone tell you that you can't be a great programmer because
you're just starting out...but realize that you'll never catch up,
either :) If I could somehow capture all my knowledge in a single
container, it would appear to be less than a single star in a full
night's sky of a million galaxies. That is how quickly our world is
evolving right now as our fingers whack out new electrons to slam into
the bit bucket when California's grids go down :)
> > HTH...
>
> It does.
>
> Lucas
Glad to hear it.
Take Care.
Rob!