Copyright © 2015-2021,2022 by Thomas E. Dickey


This is a collection of simple tools which I wrote during the 1990s. Although a couple (cpd and sue) are older, I wrote most after leaving the Software Productivity Consortium in 1994.

Because I use a couple (isatty and newpath) in my shell initialization, these are (after ded and vile) the first programs that I compile for setting up a new machine.

Unlike most of my programs, there is little documentation aside from this webpage. My reason for writing this webpage is to provide a resource for the other pages.



I use acmerge as part of my process for updates to/from my-autoconf macros. Given the file created by acsplit and the subdirectory AcSplit containing macros, this program creates a new aclocal.m4 file.

The program adds the macros in the order in which they are listed; I use a text-editor (vile, of course) to sort the list.


acsplit splits an aclocal.m4 file into one file per macro, in the subdirectory AcSplit. It also creates a file from which the acmerge program can reconstruct the original.

There are some limitations on syntax which are addressed by my coding style. For instance, I use “dnl” comments between macros to help the program find macro begin/end.
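Absent the real tool, the effect can be sketched with GNU csplit (the “{*}” repeat count is a GNU extension), splitting at the “dnl” separator lines which the coding style guarantees. The file names, the separator shown, and the trailing cat are illustrative, not acsplit's actual output:

```shell
# A rough stand-in for acsplit: break an aclocal.m4 into one piece per
# macro at the "dnl ---" separator lines, then rebuild it (the acmerge
# step, in miniature).  Requires GNU csplit for '{*}' and -z.
cat > aclocal.m4 <<'EOF'
dnl ---------------------------------------------------------------------------
AC_DEFUN([CF_ONE],[AC_MSG_NOTICE([one])])
dnl ---------------------------------------------------------------------------
AC_DEFUN([CF_TWO],[AC_MSG_NOTICE([two])])
EOF
csplit -s -z -f macro. aclocal.m4 '/^dnl ---/' '{*}'
ls macro.*
cat macro.* > rebuilt.m4
cmp -s aclocal.m4 rebuilt.m4 && echo match
```

The -z option suppresses the zero-length piece which would otherwise precede the first separator.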


I wrote chrcount to help with testing ncurses.

It counts characters from one or more files, optionally converting on the fly from printable form to nonprinting form. The program is useful for getting metrics from map/unmap output files.


count_files is like the POSIX utility wc, except that this assumes that the standard input is a list of pathnames.
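A similar effect is possible with xargs and wc, assuming the pathnames contain no spaces or quotes:

```shell
# count_files is not standard; xargs plus wc gives a comparable result
# for a list of pathnames read from standard input.
printf 'one\ntwo\n' > f1
printf 'three\n'    > f2
printf 'f1\nf2\n' | xargs wc -l
```

The last line of wc's output gives the combined total across the listed files.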


The cpd program copies the modification timestamp from the source file (first parameter) to the target files (succeeding parameters).

I use this program in the archive script, for instance, to set the timestamp on the resulting tarball to reflect its contents. You can achieve a similar effect with zip using its “-o” option.

Although cpd is the second oldest of the programs in this collection, I based it on an older script which I wrote while at the ITT Advanced Technology Center in 1982 or 1983. The utility in VM/SP CMS for copying files had options which could be combined to copy just one line from a file or copy from multiple files, preserving the modification-time from the source file. I used the feature to write a “copydate” script, which I put to good use.

Since the mid-1990s (seen in X/Open CAE Specification Issue 4, Version 2, 1994), POSIX touch has provided a -r option which gives the same functionality. According to the rationale, this was not based on existing practice:

The -r option was added because several comments requested this capability. This option was named -f in an early proposal, but was changed because the -f option is used in the BSD version of touch with a different meaning.

I still use my utility, from habit and because it is less typing.
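For comparison, here is a minimal sketch of the equivalent touch -r usage; GNU date's file-reference -r option is an assumption here, used only to read the timestamps back:

```shell
# What "cpd source.txt target.txt" does, expressed with POSIX touch -r.
touch -t 202001010101 source.txt       # give the source a distinctive mtime
touch target.txt
touch -r source.txt target.txt         # copy source's mtime to target
# Verify (GNU date assumed): both files now report the same epoch time.
test "$(date -r source.txt +%s)" = "$(date -r target.txt +%s)" && echo same
```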


Originally written to convert to/from hexadecimal, hex also displays the characters which correspond to its parameters, both as ASCII and UTF-8. The program displays all of the possible results for each parameter. For example:

hex 0x1234567 1234567 01234567


0x1234567: 19088743 0110642547 0x1234567 text "\001#Eg" utf8 \371\210\264\225\247
1234567: 1234567 04553207 0x12d687 text "\022\326\207" utf8 \364\255\232\207
01234567: 342391 01234567 0x53977 text "\0059w" utf8 \361\223\245\267
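The radix conversions in the first row can be checked with printf(1), which treats numeric arguments as C constants, so a 0x prefix is read as hexadecimal:

```shell
# Reproduce hex's decimal/octal/hexadecimal conversions for 0x1234567.
printf '%d %o 0x%x\n' 0x1234567 0x1234567 0x1234567
```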


If the standard input and standard output are both connected to a terminal, isatty exits with a “success” status, otherwise it exits with “failure” status.

I use the feature in scripts to suppress prompting if they are not being run interactively. At the time I wrote this in November 1994, I was using csh, e.g., on a SunOS 4.1.3 machine, which had no convenient method for this. At the same time, my home directory was NFS-mounted on a dozen other types of systems where I did development. I used isatty in the shell initialization to decide if it should call newpath to set the PATH to reflect the local machine's directories.

I was aware that some variations of Bourne shell provided a way to do this, e.g., using the test utility, but was uncertain if that was portable. For instance, I had used systems where [ and ] were not available as links to test and was uncertain if test itself might be only a built-in for some version of Bourne shell. If that was the case, it would not be usable from csh.

As it happens, POSIX calls test a “utility” by which is meant that a compliant system provides test as a program which can be run independently of sh. Assuming that was the case for the various systems I used in 1994, that might have worked in csh like this:

if ( { test -t 0 } && { test -t 1 } )

rather than

if ( isatty )
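In a modern POSIX shell the same check reduces to the test utility's -t operator; a minimal sketch:

```shell
# POSIX-sh equivalent of isatty: succeed only when both standard input
# and standard output are connected to a terminal.
if [ -t 0 ] && [ -t 1 ]; then
  echo interactive
else
  echo non-interactive
fi
```

Run from a pipeline or cron job, this prints "non-interactive", which is exactly the case where a script should suppress its prompts.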


I wrote map along with unmap to help with testing ncurses.

map reads one or more files (or the standard input if no files are given) which were written by unmap, converting back to a mixture of printable and control characters.

The program recognizes one option, “-u” to tell it to encode UTF-8.


Given one or more files, specified either on the command line or via a pipe, the program computes the date and/or name of the newest one, printing the result to the standard output.

The program accepts two options: one prints the modification-time of the newest file; the other prints its name.

If neither option is given, “-d” (the modification-time) is assumed.
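Absent the program itself, the name-printing behavior can be approximated with ls -t; this sketch assumes well-behaved filenames:

```shell
# Pick the newest of two files by modification time: ls -t sorts
# newest-first, so the first line is the answer.
touch -t 202001010000 older.txt
touch -t 202106150000 newer.txt
ls -t older.txt newer.txt | head -n 1
```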


newpath prints a modified version of PATH (or optionally, another variable using the same syntax). I wrote this in 1994 to help with reusing my shell initialization scripts with systems which have different execution paths—and NFS-mounted directories. Rather than wait for NFS-timeouts, newpath can be used to prune the environment of non-responsive parts.

Here is its usage summary:

Usage: newpath [options [directories]] [ - command]

Echos the PATH environment variable with the given directories added
(the default) or removed.

    -a NAME modify path after given NAME
    -b      put new arguments before existing path
    -d      remove duplicates/non-directory items
    -e      put new arguments after end of path
    -f      allow filenames to match, as well as directories
    -n NAME specify environment-variable to use (default: PATH)
    -p      print in C-shell form
    -r      remove arguments from path
    -v      verbose

Put a '-' before a command which is invoked with the environment variable
updated, rather than echoing the result to standard output.
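The duplicate/non-directory pruning of the -d option can be sketched in POSIX sh; this is illustrative only, not the real program's logic:

```shell
# Sketch of newpath -d: split a PATH-style value on colons, drop entries
# which are not directories, and drop duplicates, preserving order.
prune_path() {
  _in=$1 _out=
  _save_ifs=$IFS; IFS=:
  for _d in $_in; do
    [ -d "$_d" ] || continue                       # drop non-directories
    case ":$_out:" in *":$_d:"*) continue ;; esac  # drop duplicates
    _out=${_out:+$_out:}$_d
  done
  IFS=$_save_ifs
  printf '%s\n' "$_out"
}
prune_path "/usr/bin:/bin:/usr/bin:/no/such/dir"
```

The case statement wraps both the accumulated value and the candidate in colons, so that "/bin" cannot falsely match inside "/usr/bin".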


perror interprets its parameters as errno values, printing the corresponding error messages. For example:

perror 1 2 3


perror: Operation not permitted
perror: No such file or directory
perror: No such process


Like isatty and perror, realpath is little more than a wrapper around a C library call. Oddly enough, this one (by far the most recently standardized) is the only one where there is likely to be a program with the same name.

Programs with the same name and different function are a problem. I wrote my (simple) program in 1994, but did not publish it. Here is the comment header:

 * $Id: index.html,v 1.27 2022/04/10 20:44:58 tom Exp $ 
 * Title:       realpath.c 
 * Author:      T.E.Dickey 
 * Created:     09 Jun 1994 
 * Function:    Uses the SunOS-specific 'realpath()' call to resolve the 
 *              pathnames given as arguments into absolute pathnames.  This 
 *              seems to work better than 'pwd' in the shell script idiom 
 *                      FOO=`cd foo; pwd` 
 *              since SunOS does not always resolve the directory properly 
 *              when the current directory is mounted. 

My initial version was 65 lines. Lars Wirzenius wrote a simpler program (34 lines) in 1996 which, according to its package description, has since grown a lot. Because I do not use any of its options, that instance works for me. The misc_tools configure script drops my version from the build if the later one is found.


slowcat writes a file to standard output slowly.

The program accepts an option, a delay multiplier from 2 to 9:

slowcat foo

delays 5 milliseconds per character, while

slowcat -9 foo

delays 45 milliseconds per character.
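A line-at-a-time sketch in POSIX sh conveys the idea, though the real slowcat delays per character and in milliseconds (a whole-second sleep is used here for portability):

```shell
# Crude slowcat sketch: copy a file to standard output, pausing after
# each line.  The real tool's delay is per character, in milliseconds.
slowcat_sketch() {
  while IFS= read -r _line; do
    printf '%s\n' "$_line"
    sleep 1
  done < "$1"
}
printf 'first\nsecond\n' > demo.txt
slowcat_sketch demo.txt
```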

Unlike realpath, there are several unrelated programs named slowcat. I wrote this slowcat (early 1997), somewhat later than the other simple utilities I wrote for testing ncurses. Later I added those to my archive area, mentioned in this discussion in November 2000. But none of those programs appear to have been influenced by mine. More likely, the Perl script was more familiar.


I wrote splitit in 1994 to help with backing up my development files. I had started working with Linux late in March 1994, from an initial set of 60 floppy disks. Most of that was a Slackware distribution. My machine had no tape drive, certainly no CD burner. So my fallback plan for backup was... floppy disks. Using MS-DOS, of course. There was a program called splitit (this might be relevant). I called mine splitit because the MS-DOS program was part of the same procedure. Other developers made the same choice, e.g., this for Amiga.

Writing to floppies was a short-lived idea. But I found it useful to be able to dump a filesystem to a FAT32 partition, to burn CDROMs or to load into other machines on a multiboot system. Rather than write a tar file which may have been too large for some program used in the procedure, my script chopped it up into suitably small chunks.
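The chop-and-reassemble idea can be sketched with POSIX split; the 1440k chunk size echoes the floppy-disk origin:

```shell
# Chop a large archive into floppy-sized pieces, then reassemble and
# verify.  3000 KB at 1440 KB per chunk yields three chunks.
dd if=/dev/zero of=big.bin bs=1024 count=3000 2>/dev/null
split -b 1440k big.bin chunk.
cat chunk.* > rejoined.bin
cmp -s big.bin rejoined.bin && echo identical
ls chunk.* | wc -l
```

Because split names its pieces in lexical order, a plain cat of the glob reconstructs the original.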

Since focusing on virtual machines in 2010, I rarely use multiboot configurations, except for two old file-servers.


sue provides a "su" command which preserves the caller's environment variables.

The misc_tools configure script checks if sudo is installed, and if so, installs a script “root” which uses sudo. Otherwise, it installs this program. Additionally, it has an option --with-sudo-hacks to install other copies of sue, named for specific users. This is a sharp-edged tool, which I use in managing source-archives (specific users) as well as to simplify scripting differences between systems with a working sudo and those without.

By way of background, I wrote sue in 1990, some time before sudo appeared on the scene. Referring to this page, I might have noticed comments about it in 1994, but more likely saw it in use at the end of the 1990s. The cited history of course is from the viewpoint of sudo's developers.

Accordingly (disregarding the developer's joke), sudo has been available for some time, and I do use it when it is available and suitable.

However, sudo does not preserve the caller's environment variables. That makes it less than useful for the case where I am managing source archives on a local machine (using scripts where a password prompt would interfere). Typically, by default, sudo has its timeout set to 15 minutes. In some environments, it may be less than that.

A setuid wrapper is not recommended for general use. But sudo is not a panacea.


Like acmerge, this program merges together the parts of a split-up file. Specifically, timerge merges the parts of a split-up terminfo source file (usually terminfo.src), using the parent file which was produced by tisplit.


Like acsplit, this program splits a file into pieces. Specifically, tisplit splits a terminfo file (usually terminfo.src) into a parent file which can be reprocessed by timerge, and the entries in a subdirectory. The subdirectory name is TiSplit.

I have used this rarely, e.g., to analyze differences between ncurses' terminfo.src and the variant once maintained by Eric S. Raymond. The results are summarized in the ncurses FAQ.


unmap translates one or more files (or the standard input if no parameters were given) containing nonprinting characters into visible form.

The program recognizes one option, “-u” to tell it to decode UTF-8.

When I wrote unmap/map I had in mind the vis/unvis programs from 4.3BSD Reno. However, I wrote these with two goals in mind: to keep unmap simple and portable, and to make the output useful for debugging escape sequences.
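unmap itself is not part of any standard toolset; for a comparable visible rendering of control characters, POSIX od -c can serve in a pinch:

```shell
# Render an escape sequence in visible form: od -c shows the ESC byte
# as its octal value 033, much as unmap makes control characters visible.
printf 'A\033[1mB\n' | od -c | head -n 1
```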


width displays lines from a text file that are longer than a given threshold (usually 80).

The program recognizes several options:

Usage: width [options] [files]
  -4     set tabs to 4
  -8     set tabs to 8
  -n     show line-numbers of wide lines
  -p     report per-file (otherwise the maximum of all files is computed)
  -q     suppress listing of lines wider than the -w option
  -s     print summary showing the number of lines for each length
  -t XX  set tabs to XX
  -w XX  set threshold to XX, showing all lines that are wider
(If you do not specify tabs, they will be counted as single-columns)
Use a '-' instead of [files] to process a list of filenames from the
standard input.

I use this to check for too-long lines in change-logs, where I would like to keep the file within 80 columns.
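For comparison, a rough equivalent of running width with “-n” and “-w 16” can be sketched in awk, counting tabs as single columns as width does by default:

```shell
# Report lines longer than a threshold, prefixed by their line numbers.
printf 'short line\nthis line is definitely wider than the threshold\n' > demo.txt
awk 'length > 16 { print FNR ": " $0 }' demo.txt
```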

