
WWDC02 • Session 901

Command-Line Development Tools

Tools • 1:10:25

Investigate the command-line development services available in Mac OS X with the installation of the Mac OS X Dev Tool package. We discuss and demonstrate how to use the Terminal and GNU tools for compilation and debugging. Differences and similarities to other UNIX-derived systems as well as specific Mac OS X development concepts will also be presented. Developers should complete this session with a working understanding of the command line tool environment in Mac OS X.

Speakers: Stan Shebs, Sean Eric Fagan

Unlisted on Apple Developer site

Transcript

This transcript was generated using Whisper and has known transcription errors. We are working on an improved version.

Ladies and gentlemen, please welcome Technology Manager for Development Tools, Godfrey DiGiorgi. Good afternoon. It's good to see so many of you here today. I hope the show has been running well for you so far. Yesterday in our development tools overview, we talked about the fact that we build on standard technology and standards.

And many people new to Mac OS X have been encountering the command-line tool set of UNIX for the first time. Just out of curiosity, let's use the applause meter. How many people here are coming to Mac OS X from a UNIX environment? It's about there. How many people are investigating the UNIX tools from a Mac OS environment? That's about there. That's about what I expected. So without further ado, let me introduce Stan Shebs, our senior engineer in core tools. And he's well known in the open source community.

Hello, everybody. Does this-- Can everybody hear me? Okay. So... Wanted to start off by going through the basic thing we're going to be talking about today, which is the fact that Mac OS X has a full set of command-line tools. We use these tools to speed our OS X ports, especially coming from UNIX. And what it does is it gives us, the command-line tools also give us a familiar place for UNIX applications, both for compiling them and then for running them once we've compiled them.

So the specific topics, I'm going to go over how to build and debug using command-line tools, talk something about what the tools are available on a standard system with the developer tools installed, talk about some of the unique features of the command-line tools, go a little bit into porting issues, although not in great depth because there's another session for that, and then also to talk something about the documentation resources that are available.

[Transcript missing]

Another thing that having the command line available on Macs now is that it gives us a lot of leverage. 30 years of development on UNIX from everything from servers to scientific applications to educational things and so forth, there's actually an amazing number of tools that have been developed and delivered, both large and small. And so a lot of that stuff will simply just work on Mac OS and Mac OS X. And this is actually the first time that things just worked on Mac OS X.

So to get started with all that, we have a terminal program. And this is an application. If you go to Applications, Utilities, there's a Terminal icon in there. It's also located at /Applications/Utilities/Terminal.app, but of course that doesn't really do you much good, because that's the command-line path and you can't type open /Applications/whatever until you have a terminal window. Anyway, so that's just to show you where it is, and you can actually cd down into that and see what the Terminal program is made up of.
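As a minimal sketch of the commands implied above (a .app bundle is just a directory, and open behaves like a double-click in the Finder):

    cd /Applications/Utilities/Terminal.app    # poke around inside the bundle
    ls
    open /Applications/Utilities               # opens a Finder window on that folder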

So Terminal, how many people here have not seen Terminal or not used Terminal, let me ask? How many people have not used Terminal? Okay, good. So everybody's actually seen all this and this is all redundant. Anyway, in case you didn't know it, you can open up multiple windows. Each window runs a separate instance of the shell, and I'll talk about the shell more in a moment. It gives you lots of preferences, and it is available on all systems.

So you can go to somebody's strange OS X system, whether or not it has the Dev Tools installed, and Terminal will be there. So now Sean Fagan is going to give us a demo of the Terminal program. Not that anybody needs it, but Sean has a few tricks up his sleeve.

I can hear that, okay. I had hoped that there would be fewer people who didn't know the Terminal application. I'd give a spiffy demo of it, but you all know about it, so... Here's where you get to it. It's under Utilities, Terminal. And everyone knows about Console too, right? All right.

Let's make the font larger. I'm going to be using this later so I have to set it up anyway. The display, set font to something larger. Let's go 14 point. How's that? Can everyone see it except for the windows in front? Great. The other thing I was going to do, there is, as Stan alluded, there is the open command, which is kind of like double-clicking something. You do that, finder brings up the root directory.

And the other thing I was going to show is people get a big kick out of setting the terminal transparency to, well, something semi-transparent.

[Transcript missing]

Okay, so let us all recite the catechism. We say emacs foo.c. We cat it out just so you can see what you did. We say cc.

We do add a -g, because we're going to run the debugger in a moment. We say a.out, because by default the name of the executable is a.out. We put the dot-slash in front because we want to run the one in our current directory, and not some other a.out that might be somewhere out in the system that our search path is going to find for us.
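As a sketch, the "catechism" above amounts to something like this at a Terminal prompt (foo.c being whatever small program you just wrote):

    emacs foo.c        # write the program
    cat foo.c          # cat it out just to see what you did
    cc -g foo.c        # -g so the debugger has something to work with
    ./a.out            # a.out is the default executable name; ./ picks the one right here
    gdb ./a.out        # and then debug it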

[Transcript missing]

and that's pretty much all you need. That's the command line tools. So now we're done and you can all go out in the halls now and tell everybody about the amazingly short command line tool session.

Okay, but nobody's getting up, so I'll go into a little bit more detail on some of the things. So the shells, the shell is basically a command interpreter. It's just a traditional UNIX-y term. One of the curiosities of UNIX, shall we say, is that there are a number of different shells that have been developed over the years.

Since a shell is just a normal UNIX program, it's actually possible to have many different shells running at the same time on a UNIX system. And so people take advantage of that and they invent shells. There are shells written in Scheme. There are shells that use funny syntaxes.

But there are two main families of shells that you'll find, and both of these are installed by default. The original family is called the Bourne shell, after the Bell Labs guy that originally put it together, and it has a specific syntax that you'll hear referred to as Bourne shell syntax. Since it was installed as /bin/sh, you'll also see it referred to as just bin sh.

Now, it turns out we have a couple of implementations of that. One's called zsh; that's the default for 10.1 and earlier. For 10.2 and later, for Jaguar, excuse me, Jaguar and later, we have bash. Bash is the GNU project's reimplementation of the Bourne shell. It's short for the Bourne-Again Shell. Ha, ha, ha.

Anyway, among other nice properties, it's quite a bit faster than zsh. The syntax, though, is not necessarily to everybody's taste, and there's another family of shells that people have used, which is called the C shell. It originally comes from Berkeley UNIX, and the incarnation that we use on OS X is called tcsh. It's the one I personally prefer.

In general, the commands look more or less the same between the two. So in some ways it's a matter of taste: how do you like to do loops, how do you like to quote things, and so forth. You kind of have to be an aficionado, maybe, to decide which one you really prefer. For most people it's good enough just to pick whatever you get by default.
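As an illustration of the taste differences mentioned above, here is the same trivial loop in the two syntax families; these are sketches, not anything from the session itself:

    # sh / bash
    for f in *.c; do echo "$f"; done

    # csh / tcsh
    foreach f (*.c)
        echo $f
    end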

[Transcript missing]

Since I'm up here and I had to put together the slides, I got to decide which one was going to be the one true editor. So we'll talk about Emacs first. Emacs is an editor that's been around for a very long time, since actually long before UNIX, well, not long before UNIX is, but somewhat before UNIX existed. And it was actually originally developed on ITS machines at MIT.

That's how old it is. It's highly customizable. It actually has a Lisp-like language inside that you use to program its behavior. And people have used that over the years to build in quite a lot of good features. For instance, Emacs' knowledge of C syntax is actually pretty good.

So if you hit a tab to go to a new line, and the new line doesn't jump over to where you think it should, you've probably made a mistake in your code somewhere up front. And it's best to, instead of arguing with Emacs, to go back and find out where you typed the extra brace, or forgot to close a comment, or what have you.

So on 10.1, we have the standard version is 20.7, and I believe it's something like a 20.8 or something like that for Jaguar. This is a single window editor, so it just sits inside a terminal window. It doesn't open up multiple windows of its own. There's an X11 version that's downloadable also, but you have to, of course, be running an X server for that. So because we value diversity, we also provide VI. You know, that's interesting. There wasn't any applause for Emacs.

This is bad. I've got a tough audience here. I have a good story to tell about VI. I actually used VI full-time for about a year, about 20 years ago. Then I switched over to Emacs and have been using it ever since. Coming back to Apple, every so often I get stuck using VI for whatever reason. After 20 years, I still remember all the VI commands.

So that's a good thing about VI. The commands are easy to learn. Things are fairly straightforward. It doesn't have all the flexibility. Now a couple of people at Apple said, "Well, what about Vim?" Okay, there is an improved version of VI called Vim. It has a lot of this customizability, but you have to get it by downloading. It's not in Jaguar.

For completeness, I'll mention we have things like Pico that are also installed in the system. Again, there are literally dozens of editors available for download if there's something else you prefer. And Sean has a demo of Emacs. This is just a quick and simple setup. It is a UNIX system. Write a file. Use real paths.

Everyone knows this program, and unlike Stan's, I have a proper return value. That's because those of us who use VI know to follow standards. There's the compiler. It's awfully quick on this machine. Head on out. And for people who like this. And we can say... Come on, where's the applause?

[Transcript missing]

There's also an Aqua version of Emacs, which was done from the NeXTstep version. Unfortunately, I didn't bring it, so I can't show that.

Thank you, Sean. So if you send your email address to Sean, he will personally send you a copy of this cool Emacs that I was hoping you would demo. This would actually be a better Emacs to use. If we can get that folded into the system, I'm hoping we'll be able to do that in a future version. The multiple windows: Emacs running inside a terminal window is convenient. You can run it over the net, you can do all that kind of good stuff. But having a multiple-window Emacs with the buffers and mouse sensitivity is actually a really good thing.

Okay, so we'll cover a little bit about languages. We have, of course, C, we have C++, we have Objective-C, and we have Objective-C++. We have Java; this is based on Sun's Java 1.3.1. There's a full set of the command-line tools, jikes and so forth, that you can run. We have a number of interpreted-type languages: we have Perl, Tcl, PHP, and in Jaguar there will also be Python. For 10.1 you'd have to download it.

Third parties also have command-line versions of their tools. There is a command-line version of Metrowerks, and there are command-line versions of Absoft Fortran. In addition, you can download g77, the GNU Fortran compiler. How many people want it to be a standard part of Jaguar? It hasn't been decided yet, but it's a commonly requested item, and so it's quite possible we'll still be able to do that.

So there are also some nutcases that have been working on Ada for OS X. Where are the nutcases? I know they're in this audience somewhere. It actually started working just a very, very short time ago. They're actually starting to work on how to get it integrated into Project Builder, but the command-line stuff is there already. It's probably not going to be a standard thing in Jaguar. You can also download versions of Lisp and Scheme and so forth. There are quite a few languages that have been ported to OS X.

So, our build tool options. As Sean alluded, there are basically two kinds of make. GNU make is the standard one. There's also a BSD make, but it's pretty much for compatibility; I personally don't know of any tools that require the use of BSD make, but I know that such exist. There's also a command-line tool called jam, which is the underlying make-type tool for pbxbuild, which is a command-line version of Project Builder.

So it sounds kind of odd, you know, why would you have a command-line tool for Project Builder? Well, it has an advantage in a work group where you say you have some people using IDE and some people using the command line, or if you have a situation where you need to run, say, a nightly script, and you don't want to have to make your most junior guy come in and click on the mouse in the middle of the night to get something to run.

So, you can also ssh into a machine and run pbxbuild there. So, pbxbuild is very convenient. The Project Builder projects are actually in a text form; they're XML syntax, and they're textual files. And the truly brave can actually edit them manually, but officially we don't recommend it.

But if you're truly brave, truly brave. Okay, I should also note that Apple's jam is extended from the net version to support PB. So if you download a net version of jam, don't expect to be able to drop it in on top of the jam that we supply.
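A rough sketch of the pbxbuild usage described above, with a hypothetical project directory and build host; pbxbuild run inside a Project Builder project directory builds that project from the command line:

    cd ~/Projects/MyTool        # hypothetical directory containing the .pbproj
    pbxbuild                    # build it without launching the IDE
    ssh buildhost 'cd Projects/MyTool && pbxbuild'    # e.g. from a nightly script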

Other useful tools: we have about 400 or 500, I think in Jaguar it's actually closer to over 500 now, tools in /bin and /usr/bin. And the collection of tools most resembles FreeBSD. There's CVS, RCS, and SCCS for source code control. CVS is perhaps the most commonly used of those; a lot of open source projects that run on the net will use CVS over the network to manage their code.

For the old UNIX types that use the old troff and nroff formatting tools, those are actually all still there. At this point, their main use is to format the man pages. I'm not sure that anybody actually writes new stuff in them outside of man pages, but they're all there if you love that kind of stuff. We have lex and yacc for compiler generation, and we have the GNU equivalents, flex and bison.

We also have a lot of tools that are more intended to manipulate object files and do fairly low-level things. There's ranlib and ar, which I'll talk more about in a bit. The assembler is called as. It generally resembles the GNU syntax, but it's evolved somewhat, so it's a good idea to check the OS X assembler manual if you're getting into assembly code.

For the old Mac heads, the syntax is somewhat different from PPCAsm, so you should be aware of that. otool is the tool we use instead of objdump to dump out object files and get at their contents, and you can do man otool to find out more about it. We also have things like autoconf, which I'll also talk more about.
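For example, a couple of the otool invocations implied above (foo.o is a hypothetical object file):

    otool -L /usr/bin/cc    # list the shared libraries a binary links against
    otool -tv foo.o         # disassemble the text section of an object file
    man otool               # the full story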

Okay, shifting gears a little bit to talk about APIs. We implement most of POSIX, okay, but we don't implement all of POSIX, and we want you to report POSIX nonconformance wherever you find it. Now, POSIX applies to both command-line tools and programming interfaces. So, for instance, if ls -l doesn't operate as you would expect it to on a different UNIX machine, that is a nonconformance on the command-line tool side of the POSIX specification, and anything in the APIs that doesn't operate like what a, quote, normal UNIX machine does would be a nonconformance in the POSIX API.

So we'll get to full POSIX conformance eventually, I think, but it's certainly not the case in 10.1, and it's not the case in 10.2. Other APIs: we do OpenGL and GLUT. From personal experience, I think things come over quite readily, with very little change required. We also have Carbon and Cocoa for GUI, which, of course, you've heard nonstop about here.

There's also a different API, which is I/O Kit, that's used for drivers and kernel extensions. And I/O Kit is a C++ object-oriented thing. It's actually kind of interesting to use C++ at the device driver level. It makes it possible to write very small drivers and be able to share most of the code.

And for anybody that's ever tried their hand at a Linux device driver, you'll really appreciate that. We provide a lot of the common code, and you only have to write the stuff that's unique to your device. We also provide, not as a standard part of OS X, but we have a couple different X11 systems available. So if users can run a server, then X11 apps can run on those machines also.

So APIs are provided by what are known as frameworks. And I have to apologize for doing a graphics on this. This is the only graphics in the talk. Command line guys don't need no stinking graphics. But this one looked good and I said, oh sure, why not. Okay, so a framework is basically Apple's answer to the problem of managing shared libraries and headers and resources that go along with those libraries.

So what it actually looks like on the system is you'll have something for a framework called Foo. You'll have a Foo.framework. Inside that, you'll have the shared library proper, which will usually just be called Foo. And if you did the file command, the file command tells you what's in a file.

And the file command says this is an executable. Well, in some sense, it's an executable, but you don't execute it by itself. It's just a shared library. That's just how it happens to be marked. There's also a subdirectory called headers. All the headers sit in that, and there's a subdirectory called resources. And the resources are things like strings that you might localize or other information about the framework. And these are all text files, again, in XML syntax. And you can edit them if you're feeling brave.

So, another detail about frameworks: if you actually go wandering around looking at a framework, you'll notice that a bunch of stuff is symlinked. When I say Headers there, Headers technically is a symlink to Versions, then, say, A, then Headers. This is a versioning mechanism that frameworks have, which in practice is not really used very much. The most you'll notice is when you try to look at the headers and it says, oh, well, this is just a symlink; then you have to say headers/ with a trailing slash, and that'll show you what's inside. So that's just something to know about frameworks.

To use them, you'll say include Foo/Foo.h in your source code. I'm sure some of you are looking at that and thinking, "Isn't that confusing if you had something like /usr/include/foo/foo.h?" Actually, yes, there is an ambiguity there. They chose this framework syntax before I came along, or I would have told them to use something like Foo;Foo.h or something that wasn't so obviously a path. But anyway, we luck out most of the time, because /usr/include doesn't have a Foo subdirectory in it, so this thing doesn't get confused. Is that, something about the capital letters? Yeah.

uppercase, lowercase, that's a non-issue. It's basically a convention thing.
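A small sketch of what the framework layout and the include/link conventions described above look like in practice; CoreFoundation is used here only as a familiar example, and foo.c is hypothetical:

    ls /System/Library/Frameworks/CoreFoundation.framework/
    # CoreFoundation   Headers   Resources   Versions   (roughly)

    # foo.c contains:  #include <CoreFoundation/CoreFoundation.h>
    cc foo.c -framework CoreFoundation -o foo

    file /System/Library/Frameworks/CoreFoundation.framework/CoreFoundation
    # reports it as an executable / shared library, as discussed above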

[Transcript missing]

Most of the frameworks live in /System/Library/Frameworks. There are about 40 or 50 of them, I guess. However, most of the standard UNIX system stuff doesn't live in the frameworks. The normal headers, the POSIX headers, live in /usr/include, and they look the same as always. If anybody remembers the public beta and earlier stages, there was actually a, quote, System framework, and all the headers actually lived under /System/Library/Frameworks; /usr/include was sort of symlinks.

In practice, that turned out to be kind of confusing, and so we essentially ditched the whole System framework idea. The only remnant now is that if you look in /usr/lib, you'll see a reference to, or a copy of, the file /usr/lib/libSystem.a, or sorry, libSystem.B.dylib or something like that. That is the standard C library and all the stuff that goes into it, and it still has the syntax as if it were a framework.

Anyway, to define your own frameworks, you need to, well, you can stick them in /System/Library/Frameworks so they can be erased by the next OS X software update, or, I'm sorry,

[Transcript missing]

So moving on again. So the compiler is GCC. And we get this question a lot. I think pretty much everybody's worked it out by now. But the compiler is GCC. You do not need to download from the FSF and try to build GCC in order to get started because you'll almost certainly be disappointed.

We have a lot of extensions to GCC. Some of those extensions, like the framework lookup, are fairly important for doing anything on OS X. So whether those things will make it into FSF GCC, we don't know. It's kind of a thing unique to the Mac. And the rest of the world isn't necessarily that enthusiastic about adopting this stuff. But by and large, the usual GCC options work.

We do have a number of additional options. Most of them are specific to Mac programming, so you won't need them for porting UNIX applications. And for Mac applications, most of that stuff will have worked already; that is, if you used it previously in an OS 9 application, pragmas and so forth, they'll work pretty much the same on OS X.

I just wanted to call your attention to a couple of interesting options. We have a -faltivec option that enables all the AltiVec support. It's off by default, because AltiVec actually makes the symbol "vector" into a keyword; if you had a program with a variable called vector and you turn on -faltivec, it kind of chokes. So we made that an option. You have to actually explicitly request -faltivec.

So, if you're porting anything, and there's probably not too much of this code, if you're porting from a PowerPC Linux where they had a Motorola version of the AltiVec support added to GCC, they called that option -fvec. We renamed it to -faltivec because that was a little bit more accurate, and so that's what you'd change the -fvec into.

Another option, which the compiler session will talk about in some more depth, is -mdynamic-no-pic, which you might want to add when building applications. It saves a couple of instructions, but it makes your code not relocatable. This is fine for applications. You wouldn't want to do this all the time, because shared libraries you actually want to be able to relocate around an application's address space as the dynamic library gets loaded and unloaded by the OS.
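A sketch of those two options on a compile line (file names hypothetical):

    cc -faltivec -c vecstuff.c                # AltiVec on; "vector" becomes a keyword
    cc -mdynamic-no-pic -O2 -o mytool main.c  # fine for an application, not for a shared library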

So to build libraries, it depends on whether you want to do a statically linked library, the traditional archive format, or a dynamic library. To build a static library, you use ar, another old UNIX program, and then you use ranlib on the result, essentially to add a table of contents. This is something that you don't normally do on a Linux machine, so that's a little bit of a departure for people that are familiar with Linux.

The other way to build a library is to make it a dynamically linked library. That's the kind of library that underlies a framework. There we use the option cc -dynamiclib, and then all the right things happen behind the scenes. Now, one way to cover this up, and quite a few heavily ported applications know how to do this, is a system of shell scripts called libtool, which tries to give a platform-independent mechanism for building libraries.

One of the unfortunate things about the many UNIX variants in the world is that while they all call their compiler cc and they all use -g to say "output debugging info," somehow that never quite happened for building shared libraries. On one system it's -r, on another it's -shared, for us it's -dynamiclib, and so on. libtool is basically a wrapper, a shell script, that wraps around the compiler and linker invocations to pass the correct kinds of options for the platform you're running on.

So it's a very handy thing. For porting, the main thing you have to worry about is updating the libtool bits. There's a standard version that comes from the GNU project, gnu.org, and most projects have fairly up-to-date libtool bits that know about Darwin and OS X, but I've seen a couple where they've been out of date, and the right thing to do is to update.
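Putting the two recipes above side by side, as a sketch (these are the raw invocations that libtool would otherwise hide; file names hypothetical):

    # static archive
    cc -c foo.c bar.c
    ar cr libfoo.a foo.o bar.o
    ranlib libfoo.a                 # add the table of contents; the step Linux folks tend to skip

    # dynamic library, the kind that underlies a framework
    cc -dynamiclib -o libfoo.dylib foo.o bar.o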

And speaking of Darwin, another thing to know about is the different versions of the system. You've heard this a bunch of times: the lower-layer system is called Darwin. Darwin basically consists of the open source part, with few or no frameworks (I think Core Foundation is probably the only framework that's actually in Darwin) and none of the GUI stuff.

This matters because there are two different kinds of tools you use to get version numbers. One of the tools is called sw_vers. That reports the Mac OS X version, such as 10.1.3. It also, underneath, reports a build number, something like 5P64. That's the way to identify an exact build image. Most of the time that really only matters to people getting seeds, where it's different from the final release number.

The other tool, and this is a standard UNIX tool, is called uname. You can say uname -a and it will give you the Darwin version number. There's a little chart showing how they match up: Mac OS X 10.0 is Darwin 1.3.1, and 10.1 is 1.4. Then, for reasons that seemed good at the time, and they still seem good, we bumped the numbers, so the software update to 10.1.1 changed the Darwin version number to 5.1.

There were a few people surprised by that, let me tell you. It's not often you get a major version number change by doing a software update. Anyway, the purpose of that was actually to align the Darwin version numbers with the build numbers used internally. So when I say 10.1 is build 5P-whatever, Darwin is at version 5 to go along with that 5. To take an example, on your Jaguar systems, uname -a will say it's Darwin 6.0, to go along with 10.2 as the OS X version number.
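The two version-reporting commands, as a quick sketch (the exact numbers of course depend on the system you run them on):

    sw_vers      # prints ProductName, ProductVersion (e.g. 10.1.x), and BuildVersion
    uname -r     # prints the Darwin version, e.g. 5.x on 10.1.x or 6.0 on Jaguar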

The reason this matters is that it ties into one of the very common ways of handling ports, which is to use GNU's configure-and-make process. There's a lot of stuff to go into here; we don't really have time to go into it in any depth.

Makefiles are basically not adequate for really large or complicated software projects. So what the GNU project adopted years ago was a process of editing the Makefile to adapt it to the system, via a shell script which started out a couple of pages long and is now like 100 pages if you ever printed it out. The configure script will essentially edit the Makefile and take care of any system dependencies or user-chosen options or what have you, so that the right thing will happen for the system.

And this is why, in many of these projects, you can actually pull the source code down off the FTP site, take it onto your OS X system, even if the code has never been ported before, and many of them just work. Configure knows how to dig around in /usr/include. It knows how to turn ifdefs on and off depending on what a system has on it. It edits the Makefile appropriately, and when you run make, all those things are taken into account and the build produces the executable.
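The canonical dance, for anyone who has not typed it before; assume a freshly unpacked source tree with a hypothetical name:

    cd somepackage-1.0
    ./configure      # probes the system and edits/creates the Makefiles
    make             # builds using the choices configure made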

So one of the ways, though, in which that could fail is if the guessing part about what system you're running on is incorrect.

[Transcript missing]

And then later on, the configure script can make a test based on that: is this a version that supports some feature in ctype.h, or something like that.

Anyway, config.guess for many projects nowadays is up-to-date enough to recognize Darwin as a platform, but not always. And if yours is one of the unlucky ones, then you want to get the two scripts, config.guess and config.sub, from the GNU organization. I put up the URL that takes you to the CVSweb page showing the latest version. Since these two scripts are self-contained, it's generally always okay just to take them and drop them into an existing project.
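A sketch of dropping fresh copies in, assuming you have already fetched them from that URL to /tmp:

    cp /tmp/config.guess /tmp/config.sub .
    ./config.guess     # should now print a triplet along the lines of powerpc-apple-darwinX.Y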

Another thing to help you get started on porting: some projects do want to see GCC; they actually invoke gcc by name. It's convenient to symlink gcc to cc and g++ to c++. This will be done in Jaguar; if you look at Jaguar, it's already got those symlinks, so this is strictly a 10.1 issue now.
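For a 10.1 system, the symlinks in question amount to something like this (run as root; Jaguar already ships them):

    ln -s /usr/bin/cc  /usr/bin/gcc
    ln -s /usr/bin/c++ /usr/bin/g++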

Now the other thing that people often ask about is they want some kind of predefined macro. And we don't have a predefined macro. Partly the reason is historical, partly it's a policy thing. The macro that we would use would inevitably get confused with, say, different generation of Next Step, which doesn't really matter much anymore, but it did up until fairly recently, and other versions of Apple's systems as well.

So in general, what we recommend is either to test for the features directly, and I'll talk about that more in a minute, or to essentially insist that the user add a -D define of some Mac OS X symbol and then use that in your code, if you absolutely have to do it that way.
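A sketch of the user-supplied define approach; the macro name here is purely illustrative, not a system-provided symbol:

    cc -DMACOSX -c port.c    # the user opts in with -DMACOSX (or whatever name you document)
    # and port.c tests it:   #ifdef MACOSX ... #endif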

So there are interfaces that have to be ported, despite all the command-line emphasis. Curses just works; curses is built into the System framework, it's already all there. I haven't personally experienced any problems bringing over a piece of code that uses curses. There are now Aqua-fied versions of Tcl/Tk and of Qt, so you can actually write a Tcl script that brings up the lickable buttons and all that good stuff.

X11 is available if your users are willing to run an X11 server. In something like a university environment, that would be a very reasonable thing to do. The distributions are readily available: there's XFree86, and there's a commercial package from Tenon.

[Transcript missing]

The OpenGL framework is called OpenGL.framework, which means you include OpenGL/gl.h. So one way is to change the source code to ifdef every include of GL/, or the other alternative might be to actually add a GL subdirectory to /usr/include and make symlinks from there to the OpenGL.framework headers.
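The second alternative, spelled out as a sketch (run as root; the header names are the usual OpenGL ones):

    mkdir -p /usr/include/GL
    ln -s /System/Library/Frameworks/OpenGL.framework/Headers/gl.h  /usr/include/GL/gl.h
    ln -s /System/Library/Frameworks/OpenGL.framework/Headers/glu.h /usr/include/GL/glu.h

Note the caveat later in the session about modifying systems this way: anyone else building your code needs the same change.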

Some more common problems that people run into. The compiler uses pre-compiled headers. These pre-compiled headers win big time. They're like a 4x speed improvement for compiling Cocoa things. As it happens, relatively little Unix code that you're bringing over uses Cocoa, amazingly enough. And so you're actually not really getting any benefit from using pre-compiled headers.

And the preprocessor, this mechanism, works by using an intelligent preprocessor, sort of intelligent anyway. It takes extra time to preprocess while doing its thinking about whether to use the precompiled header or not. So as it turns out, it basically has no upsides and all kinds of downsides when doing UNIX porting, and so we have an option, -no-cpp-precomp, that basically switches back to using GNU cpp.
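In practice that is just one extra flag on the compile line (file name hypothetical):

    cc -no-cpp-precomp -c portable.c    # skip the precomp logic and use GNU cpp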

Other things you'll run into, there's missing declarations. There's some that we know about that are going to be fixed in Jaguar, and there's probably others we haven't heard about, and they're going to be fixed in whatever comes after Jaguar. Ditto for missing functions. One way to deal with that is to re-implement them, conditionalize them when they're running on OS X. Sometimes you have to if-def the functionality, alas.

Another thing that can happen is extra definitions. The System framework, libSystem, has a lot of stuff glued into it; as I mentioned, the curses library is glued into it. If you have a program that you don't normally expect to link with curses, and it defines its own gotoxy or some other curses function name, then you'll get something back from the linker saying: warning, multiple definitions of gotoxy.

And you're saying, "The only one I know of is mine." It is a warning; it's not a fatal flaw. The linker complains because Mac developers expect it to complain. We have an option, -multiply_defined suppress, that you can feed to the compiler; it gets passed on to the linker, and that will stop the complaints.

Another one that I've seen in that same category: we have an implementation of getopt, which is used to get command-line options in a standardized fashion. libSystem has a version of getopt, and a lot of GNU tools link in their own copy of getopt, so you also get warnings about duplicates there.
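As a sketch, the suppression option mentioned above can be passed through the compiler driver to the linker like this (target and objects hypothetical):

    cc -o mytool *.o -Wl,-multiply_defined,suppress    # quiet the duplicate-definition warnings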

Getting a little bit weird: so how many people knew about designated initializers already? Yeah, that's about what I thought. They've been in GNU C for about 10 years, and they're basically a way to fill in an array by position, which is kind of cool. No need to have seas of zeros trying to get down to the one element you want to fill in with the character code of 'A' or something like that.

It turns out that C99 has actually picked up this GNU extension, and it's now a standard part of the language, except they added an equal sign. Anyway, as it happens, and for technical reasons that I'll be happy to rant about offline should anybody be interested, our 2.95-based compiler can't deal with this, and it will give you lots of errors. One way in which we noticed this was in GCC version 3 itself, which uses designated initializers now, and it complained. This will be fixed in 3.1, so it will no longer be an issue.
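For anyone who has not seen them, a minimal C example of designated initializers; the C99 spelling with the equal sign is shown, and the older GNU spelling simply omitted the '=':

    #include <stdio.h>

    /* fill in two slots of a 256-entry table by position */
    static int to_upper[256] = { ['a'] = 'A', ['b'] = 'B' };

    int main(void) {
        printf("%c %c\n", to_upper['a'], to_upper['b']);
        return 0;
    }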

dlopen is also something that's occasionally requested. This is to open a dynamic library programmatically. I can't remember if it's a POSIX feature, but it's fairly common among UNIX systems in one form or another. It's not directly available, and in fact, for technical reasons, it can't be totally emulated by the Mach dynamic load stuff. So, in general, what people have been doing is emulating the functionality, whichever part of the functionality they happen to need.

Another thing that people have bumped into is P-threads. We have P-threads, but we don't have complete P-threads. Or perhaps a better way to say it is we don't have all the P-thread options that all the P-thread implementations in the world have, and so people have run into missing options. Again, sometimes you can emulate, sometimes you'll just have to send a bug report to Apple and say, "Resend it every week until something happens." Feel free. We use eight-digit numbers for radar, so there's plenty left.

We're only up to about 29 million, I guess. Anyway, namespace conflicts. It's always been possible to have a lot of namespace conflicts because of the way Mach-O works. In practice, I haven't actually seen as many namespace conflicts as one might expect. In any case, if you run into this, the way you'll see it is that you'll have a function, and it'll report a conflict.

There's no documentation on the function it's conflicting with; it's just in some framework. One way to deal with this, then, is to use the two-level namespace option, which makes non-exported functions invisible. Like I said, it happens to some people, but it doesn't happen that often. You'll know it if it does, because you have a conflict between symbol names and there's no other way to fix it.
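A sketch of feeding that option through to the linker; treat the exact spelling as an assumption to verify against the ld man page:

    cc -o mytool *.o -Wl,-twolevel_namespace    # record which library each symbol comes from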

[Transcript missing]

So, Sean is going to give us a demo.

Now, I think the way he described it, he was going to do all of Mozilla in 15 keystrokes. Yeah... I don't think so. All right, what I was just going to show here was something using configure that ran into problems. I don't know if anyone's familiar with it, if anyone has used rdist.

rdist is like rsync, which is built into the system, only where rsync pulls, rdist pushes. A while ago, I had a request for rdist, and I supplied it for someone. It almost but not quite worked: rdist 7 almost but not quite worked; rdist 6 just worked. I've already configured this because it takes a while. We do a make, and it runs for a while.

[Transcript missing]

These are what my diffs look like. Can people read that, or should I increase the size? All right. Set font, that's 14. How about 18? Oh, there we go.

All right. As you see, the first set of patches are to the configure.in file, which is what autoconf uses to generate the configure script, which you should then run. With most packages, for those who are aware of them, configure comes already generated; you just run configure. However, if you are the developer, you need to create the script that generates configure.

There were some problems with statfs in the original version, and this does the right thing. The other one worked, it just wasn't as good, I thought. And there's more of this configure stuff. And there was an include of malloc.h; malloc.h is not a standard include file, and you should not use it. And lastly, rdist used a function called setmode that conflicts with the system's setmode. So, there was that.

Thank you. Anyway, run autoconf to generate it. Run it again, and as everyone would see, it just... This did not happen in my previous tests, I'm sorry. But it worked. I do apologize. Anyway. Generally, I actually had to search for a while to find a well-known program that would not compile; most things, you just run configure and make and they do just work. Didn't I do that? Yes, in fact I did. Oh, no you didn't. Thank you. I am stupid today. Yeah, teamwork is good. Yeah. Wow, you're smarter than me. Voila, it works.

Yeah, so that's an example of how you deal with things like this. And I did, by the way, send these patches back to the rdist 7 maintainer. I have not checked recently whether he folded them in, but they were small, so I presume he did. Back to Stan. Thank you, Sean.

So that's, you know, very nearly a perfect example of what actually happens, including remembering to rerun configure when you make these kinds of changes. Now, if the thing that Sean did looked like gobbledygook, it actually is documented how to do this stuff, and that's what I was going to talk about. The tool is called autoconf, and autoconf is, how should I say, weird. It's built on a macro processor called m4, and the syntax is bizarre.

The only excuse I can think of is that it was developed incrementally over a number of years, so it has accreted as much as anything else. But what autoconf does is very powerful: it essentially directly tests your system to find out what things are present. So, for instance, say you need to know whether ctype.h is present. ANSI and ISO require that ctype.h be present.

It's not necessarily the case that every machine in the world has one. So what AutoConf does is it basically synthesizes a one-line C program that just says include angle bracket ctype.h, compiles it, and sees what the compiler said. Compiler errors out, guess what? Either ctype.h is not there or it has a mistake in it and you don't want to use it anyway.

So it then turns on a flag, depending on how you set things up, that says something like "ctype.h not available." And then in your code you would say something like: if ctype.h is available, include ctype.h; else do something else, whatever it is you need to do in your program.
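A sketch of what that flag looks like from the C side; HAVE_CTYPE_H is the name autoconf conventionally defines for this test, via a config.h that configure writes out:

    #include "config.h"

    #ifdef HAVE_CTYPE_H
    #include <ctype.h>
    #else
    /* do something else: declare or reimplement whatever you needed from ctype.h */
    #endif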

So the autoconf mechanism is essentially 100% accurate for the system you're building on. Now, it's not quite so accurate if you need to build for a range of systems. If you're, say, on a Jaguar system and you need to build something that works on 10.1 and Jaguar and future systems, you may have to think a little bit harder about how to make that work. One of the things we have available as of Jaguar is a new header file called AvailabilityMacros.h.

And this will have a set of macros that essentially define what's available in different versions of the OS. It has macros that are like half a page long, and they read something like "if Mac OS X 10.1 but not 10.2," things like that. That's how they literally read. So you can use those to make something that works across multiple versions, should it be using features that change from one version to the next.
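A sketch of using that header; the macro names below are the ones AvailabilityMacros.h defines, but treat the exact test as something to check against the Jaguar header itself:

    #include <AvailabilityMacros.h>

    int main(void) {
    #if defined(MAC_OS_X_VERSION_10_2) && \
        MAC_OS_X_VERSION_MAX_ALLOWED >= MAC_OS_X_VERSION_10_2
        /* the 10.2-and-later declarations are visible; use the newer call here */
    #else
        /* building against older headers; stick to the 10.1 interfaces */
    #endif
        return 0;
    }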

But anyway, the fortunate thing, the nice thing about autoconf, is that it actually has a book describing it, and it was just published a couple of years ago. It's a standard book, readily available. In fact, I noticed it last fall in Las Vegas, in a Borders bookstore, which is not exactly what you think of as the high-tech mecca, okay? Not Las Vegas, and not really Borders either. And in the entryway, where everybody coming into and going out of the store passes, they had large numbers of this book on display for everybody to see as they're going in and out.

And I'm going, you know, there must be a lot of AutoConf experts in Las Vegas now.

[Transcript missing]

This also describes AutoMake, which is a way to synthesize Make files that a number of people are using nowadays. And it also describes LibTool, which I talked about earlier. It actually has a lot of stuff about the inner secrets of LibTool.

Some caveats to go along with it. So, you can be root. Is there anybody that does not know how to become root on OS X? Good. Okay. Anyway, now that you are root, you have all these responsibilities to go along with it. In particular, it's very tempting to dink around with the system to try to get it to compile your program.

We do not recommend this in general, because you'll take that program over and somebody else will try to build it, and they will not have the modifications on their system. That's why we're symlinking gcc to cc in the system, so that every system will behave this way.

I mentioned before about symlinking, adding a GL/gl.h under /usr/include. Again, if somebody doesn't change their system in that way, that will fail for them. So if you do this, you need to come up with some way to, say, have a script they can run as root to modify their system in the same way.

Another thing to be aware of: Mach-O object files are very different. They are a little bit like ELF, they're a little bit like... I'm sorry, I don't have anything as exciting as whatever they're...

[Transcript missing]

is... Pardon? Oh, Python. That's right. Okay. I'm sorry. Incorrectly slandering Perl. It's Python that I want to slander. Okay.

And actually there was another similar case that happens in the GCC sources themselves. They have a library called libffi, which is supposed to call from Java to native C code, and somebody in there just had to have a mips.s and another file whose name differs only in case. So you check this out from the FSF onto your HFS volume, and CVS doesn't like it at all.

It says you've got a conflicting file. Every time you check it out or update, you've got a conflicting file. Yes, I know it's a conflicting file. And it works fine on UFS. So if you're not aware, your buddy that just happened to be building something on a UFS partition, everything works fine, and on yours it messes up for some strange reason.

So, to talk a little bit about debugging: the Mac OS X debugger is GDB. It's currently based on the FSF 5.0 release, and for Jaguar, I believe it's based on 5.1. Is that right? Yeah. And it has a lot of additional commands; I don't have space to go into them all, and there's a session on Friday that will talk about that. We have a future breakpoint that you can set for code that hasn't been loaded in yet. We can print an Objective-C object; it's almost like a reflection kind of thing.

You can ask for information on all kinds of properties of Mach, such as the Mach tasks. I found info Mach tasks to be kind of interesting because it actually lists all the tasks including the ones that don't belong to you. But I didn't actually go in and try to see what I could do to the tasks that don't belong to me. That seemed like an opportunity.
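The everyday gdb commands work exactly as on any other UNIX; here is a minimal session sketch (the Apple additions mentioned above, such as future breakpoints, Objective-C object printing, and the Mach info commands, are covered in the Friday session and their exact spellings are omitted here):

    gdb ./a.out
    (gdb) break main
    (gdb) run
    (gdb) next
    (gdb) print some_variable
    (gdb) quit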

We've heard quite a bit about GCC 3 already. I'll just cover a couple of bits that are specific to the command line. The old one is based on 2.95. The new one will be based on 3.1, unless the FSF folks drag their feet any longer on the 3.1 release. We get better optimization and C++ conformance.

We have what's called PFE for precompiled headers. It's a different strategy. I forgot to mention the command-line options: there's a "dump PCH" option to write out the headers and then a "load PCH" option to load them in again. The release notes for Jaguar will go into that in more depth.

There is something important to know about this, though, which is that in this transitional period there are a number of places where you can get hosed by GCC 3. For instance, its optimization characteristics are different: -O3, for instance, will inline code, which could cause your program to bloat. There are C++ conformance issues where GCC 2 let things go and GCC 3 says, "Uh-uh, not going to allow that," so your code simply fails to compile. Indeed, in our very own OS X sources, we ran into a few of those and people had to go fix their code.

Anyway, as a workaround, we set things up so that both compilers will be available in Jaguar. And this will be true of the shipping Jaguar as well as the previews that you have. So the rule is: 2.95.2 is always going to be available at /usr/bin/gcc2. That's the permanent home of that executable.

The latest version 3, and this will go for 3.2, 3.3, or whatever, is /usr/bin/gcc3. Then we take /usr/bin/cc and gcc and symlink them to one or the other. And to switch back and forth, we have a script called gcc_select. You just say gcc_select 2 or gcc_select 3, and it will flip all the symlinks.

So if you have a makefile that refers to gcc2 literally, you can always refer to that; you don't have to worry about the switching. For anything where you just want to refer to cc and try it both ways, the gcc_select script is available. And again, that's something that's going to be part of Jaguar for some time.
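A sketch of the switching workflow described above (gcc_select generally needs root, since it rewrites symlinks under /usr/bin):

    gcc_select 3     # point cc and gcc at the GCC 3 compiler
    cc --version     # now reports the 3.x driver
    gcc_select 2     # flip the symlinks back
    gcc2 --version   # always 2.95.2, no matter which way the symlinks point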

For Project Builder, we do have a way to pull legacy targets into Project Builder. It sets up a lot of defaults and flags, and it gives you easier access to documentation. As I mentioned earlier, there is a pbxbuild command. Sean will do the demo. I have to apologize for showing the GUI in this.

This was actually the largest part of the demo, and I was going to demonstrate... I lost my terminal, didn't I? Sorry. And I forgot to change that too. But it's real easy to do, if we remember the command and can read it. All right. True aficionados put Terminal in their startup.

I can never remember the path to the project builder script-- or program, I should say, so... Project Builder. As Stan just said, one of the things you can do is use Project Builder to build a legacy application, which is your normal UNIX-type application. So we'll do that. Set all the defaults here. Great. We don't need the release notes. Create a new project. We're going to create an empty project because it's not one of the others.

The thing I'm going to be wrapping this around is the GNU Hello World program. And you may think that a Hello World program wouldn't be very complex, but this is GNU. So what did I call it? Now, the first thing we need to do is get the files, and I have this under new project, add files. Thank you.

There are the files that I've gotten. Just select them all. Add. We want to copy rather than just refer to them, so that we'll copy the files in there. We're all done. Actually, we're not. The first thing you need to do with the GNU Hello World program is you need to configure it. So we will add a new target.

[Transcript missing]

And the action is ./configure, and we need to tell it which directory to use. And it's already there. We're all done. And now let's build it. And as you can see, it ran configure. That was almost as easy as doing it from the command line.

Now that we've done the configure, however, we actually want to build the program. So we do a new target, again a legacy makefile target. Let's call this one hello, because that's what it is. All the actions are right. We want to set the directory, because otherwise it will use the build directory and the program wouldn't be there. Now, we build. Oops, first we have to select the right target. Let's try building it again. And there. And now we can run it. Except we have not set the executable, so there's nothing to run. Setting the executable is easy.

[Transcript missing]

Build and Run. And look, it printed "Hello World!" Now that might seem like a whole lot of work for just Hello World, and you'd probably be right. However, if you have a more complicated program, you could then use this to set the, use the nice graphical interface to the debugger, which I admit is pretty nice, and almost as nice as Emacs interface to the debugger, which is my personal favorite. There's always VI's interface to the debugger. Bang bang GDB. And voila, and that's my demo. I think that's it.

The interesting part about GNU Hello is it's actually a template. It's not to the point that it does anything, but it provides a template for what a GNU program is supposed to look like.

[Transcript missing]

Thank you, Stan and Sean. Unfortunately, we have extremely little time. We started late due to a small electrical problem, so we will not have time today for Q&A.

Just for your information: technical documentation. Command-line tools all have man pages, and you get a man page at the Terminal prompt with man toolname, as on any UNIX system. There is also reference documentation in the developer documentation DevTools directory, with specific detailed information for the compiler, the preprocessor, GDB, and the Mach-O runtime, all available in PDF and HTML forms.

More information is available through various URLs. These are all wrapped together on the master URL list, which we list for everyone. A roadmap of next sessions to look at: for porting UNIX apps to Mac OS X, there's a session immediately following this one in the Civic Auditorium. We have a feedback forum for technical documentation tomorrow, as well as the first of our Project Builder forums and our performance tools session tomorrow. I'd be delighted to see you all there. And with that, we're going to have to end the session, because we are already over time. Thank you very much for attending.