The vi tutorial I posted here had been converted from a set of
linked web pages and so of course had embedded links.  When I
converted it I used type 1 text file resources to mimic the embedded
link style, but this did not sit well with me.

One thing I noticed is that it broke on older clients. The original
gopher client, for example, which is still maintained as a Debian
package, cannot display text files properly if they are requested as
type 1 instead of type 0.
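Concretely, the difference is one character in the menu line that
advertises the file. Both lines below are hypothetical (the host
and paths are made up for illustration): the first advertises a
plain text file as item type 1, which makes clients parse it as a
menu; the second is the same file as a proper type 0 entry, which
even the original client renders correctly:

```
1vi tutorial, part 1	/vi-tutorial/part1.txt	example.org	70
0vi tutorial, part 1	/vi-tutorial/part1.txt	example.org	70
```

Per RFC 1436, each line's fields (display string, selector, host,
port) are separated by tabs, and the leading character is the item
type.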

So I removed the embedded links and put the tutorial files into a
directory, sans gophermap, and renamed the individual files to be
more meaningful (not that using a gophermap is bad, just that it was
not needed in this case).

I don't think it's any faster or slower to navigate the various
tutorial files now, but they can be viewed in all clients. I guess
it appeals to some sort of 'pure' gopher experience based on the
gopher RFC that associates type 1 selectors with directories, not
files. I'm also curious whether serving text files as type 1
impedes the indexing of gopher sites by Veronica or any other
gopher search crawler. I suspect that at worst it might prevent
indexing of the embedded links, if the indexer expects only a
file named 'gophermap' or a directory listing.

But that raises the larger question - is it worth making changes
like this to support ancient clients? Is it worth it to meet some
ideal sense of gopher purity? I think the answer to the first
question is 'yes': many people still like to run systems where only
ancient gopher clients are available. The "modern" web likes to forget old
clients even exist, let alone ancient ones. Even a web browser a few
years old is unlikely to be able to navigate large swaths of the web
today. I think we should be embracing gopher, not fighting it and
making it more web-like.

As far as gopher purity - this is less clear. Some changes made to
newer gopher servers and clients are welcome. Interestingly, the
gopher RFC [0] has this to say about dynamically generated content
and server improvements:

>The retrieval string sent to the server might be a path to a file
>or directory.  It might be the name of a script, an application or
>even a query that generates the document or directory returned.
>...
>All intelligence is carried by the server implementation rather
>than the protocol.  What you build into more exotic servers is up
>to you.  Server implementations may grow as needs dictate and time
>allows.
         
This is fascinating to me: in that first paragraph, they are
describing what would become the Common Gateway Interface (CGI). A
few of the more widely used gopher servers support CGI, and I don't
see this as a bad thing. Clients only see what gets sent to them,
they don't see how the server generated it. So even ancient clients
will handle CGI requests. That second paragraph is pretty open as
far as adding server features, as long as the protocol is adhered
to.
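The point that clients only see what gets sent can be sketched with
a small, hypothetical handler (the `/uptime` selector and the
function name are my own invention, not taken from any real gopher
server): whether the reply is read from disk or generated on the
fly, the client receives the same kind of type 0 text response.

```python
import time

def handle_selector(selector: str) -> bytes:
    """Return a type 0 (plain text) gopher reply for a selector.

    Per RFC 1436, the selector need not name a real file -- it
    "might be the name of a script, an application or even a query
    that generates the document".
    """
    if selector == "/uptime":
        # Dynamically generated content, CGI-style.
        body = "Server time: " + time.strftime("%Y-%m-%d %H:%M:%S")
    else:
        body = "No such document: " + selector
    # A text transaction ends with a line holding a single "."
    return body.encode("utf-8") + b"\r\n.\r\n"
```

Hooked up to a socket listening on port 70, the request line the
client sends (selector plus CRLF) would be passed to this function;
nothing in the response reveals how the server produced it.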

For me, the line is any server feature that impacts client
support; everything else is fair game. Type 1 text files cross that
line. Gopher SSL support is fine (although I don't think it is
needed [1] [2]), as it doesn't replace port 70, it just serves
gopher on another port.

[0]: gopher://gopher.unixlore.net/0/docs/rfc/rfc1436.txt
[1]: gopher://gopher.unixlore.net/0/glog/does-gopher-really-need-tls-encryption.md
[2]: gopher://gopher.unixlore.net/0/glog/comments-on-unencrypted-gopher.md