
Releasing for Ubuntu

I've built a few pieces of C++ software I want to release for Ubuntu. What ways are there, and what can you recommend? Is building .deb files and setting up an apt repo for them the best way? What about make install? Is it considered an acceptable way to install software?

By far the simplest for me, and perhaps the most transparent for the user, would be to just have a GitHub repository in which one could run make install to get all the programs installed in one go.

Do I always install the binaries into /usr/bin?
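For concreteness, here is roughly what I picture the install step doing; the program name and the paths below are just placeholders, and PREFIX defaulting to /usr/local is only the usual convention for source installs (a distro package would use /usr):

    # Hypothetical sketch of what `make install` could run for one binary.
    PREFIX="${PREFIX:-/usr/local}"   # /usr/local for source installs, /usr for packages
    install -Dm755 build/myprogram "$DESTDIR$PREFIX/bin/myprogram"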

One of the programs contains Python 3 library code; should that be installed in /usr/lib/python3/dist-packages? (I don't want to create a pip package, as that would make the installation harder and waste more of my time.) The program also contains Python 3 examples/tutorials intended for the user to tweak and learn from; where do I install those? Do I create a ~/my-prog-tutorial-dir/ to put them in? If so, how should I name that directory?

Edit: if I simply release the statically linked binaries in a tarball, what will break eventually? libc? Are there any major application APIs that usually change between Ubuntu LTS releases? I only use pthreads, X11 and OpenGL, so I suspect statically linked binaries could be a fairly stable option?

In general, building a binary package will make your software much easier for your users to install and keep up to date. The same goes for Python packages. There are generally tools to generate apt packages from pip packages, so you can just list your Python code as a dependency of your binary package(s).
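As a rough illustration of that workflow, stdeb is one tool that can do the conversion; the package names below are placeholders and the exact commands may differ for your setup:

    # Convert a Python sdist into a Debian source package (assumes python3-stdeb
    # is installed; "mylib" is a placeholder).
    py2dsc dist/mylib-1.0.tar.gz
    # Once the resulting package is built, declare it as a dependency of the C++
    # binary package in debian/control, for example:
    #   Depends: ${shlibs:Depends}, ${misc:Depends}, python3-mylib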

You may see packaging and installers as a waste of your time, but providing only a source distribution wastes your users' time. Users don't want to have to constantly check GitHub for new versions, and they often don't want to install all of your build dependencies just to use your software. If your software is targeted at developers this may be less of an issue, but it's still extra work that your users have to go through.

As for examples, the general convention is to put them in /usr/share/doc/myprogram/samples or in a samples directory inside your Python package.
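In an install rule that could look something like the following; the program name and file patterns are illustrative only:

    # Copy the example/tutorial scripts into the conventional documentation path.
    install -d "$DESTDIR/usr/share/doc/myprogram/samples"
    install -m644 examples/*.py "$DESTDIR/usr/share/doc/myprogram/samples/"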

The best way to release software for Ubuntu depends on the software itself and its target audience, as Miles Budnek already pointed out.

Your goal is to lower the barriers to using your software. If you are targeting developers of your software (i.e. you develop source files that are supposed to be edited by others), or you are developing a piece of code that is supposed to be included in other projects (e.g. gnulib), it is probably best to just provide sources and documentation.

In any other case that I can currently imagine (including when you are targeting developers), providing precompiled binaries is a better option. The optimal solution would then be to have the software included in Ubuntu itself; How to get my software into Ubuntu? provides a lot of useful information on that, as suggested by Mark K.

Getting software into Debian or Ubuntu can be difficult and may require a large amount of time: you have to respect a lot of policies that you may not be aware of, and you have to find a sponsor. You will soon learn that a key point is to use a decent and popular build system for your software (e.g. autotools, cmake, distutils, ...), as mentioned in the Debian Upstream Guide. Being compliant with the guide will also benefit users of other distributions.
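Whatever build system you choose, the payoff for users and packagers is the familiar install flow; the autotools variant is shown below, and a rough cmake equivalent is noted in the comment:

    # Typical source-install flow with autotools.
    ./configure --prefix=/usr/local
    make
    sudo make install
    # A cmake project would use roughly:
    #   cmake -B build -DCMAKE_INSTALL_PREFIX=/usr/local && cmake --build build
    #   sudo cmake --install build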

In general, I suggest proceeding in this order:

  1. provide sources;
  2. use a common build system (from the point of view of system administrators, i.e. the people installing the software, autotools works best in my experience on POSIX systems);
  3. create a binary package (please keep in mind that you have to maintain it, or your users will likely encounter binary incompatibilities);
  4. add the package to a private repository (I suggest aptly for this task; a short sketch follows this list);
  5. try to get the package into the distribution of your choice (please keep in mind the maintenance costs).
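For step 4, the aptly workflow is roughly as follows; repository, distribution and package names are only examples, and publishing normally requires a GPG key for signing:

    # Create a local repo, add the .deb and publish it so users can add it as an apt source.
    aptly repo create -distribution=focal -component=main myprogram-repo
    aptly repo add myprogram-repo ../myprogram_1.0-1_amd64.deb
    aptly publish repo myprogram-repo
    aptly serve    # or copy the published tree to any web server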

Another option, which I do not suggest, is to provide statically linked builds. This reduces the possibility of binary incompatibilities, but increases the cost of bug fixing (e.g. if the bug is in a dependency) and of security updates, as explained in this and the following comments. Another reason to avoid static linking is that several implementations of the same ABI may exist in order to exploit hardware acceleration (e.g. OpenGL); you can, however, mix static and dynamic linking.
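To illustrate the last point, a link command along the following lines statically links a single third-party library while keeping the system OpenGL/X11/pthread stack dynamic; the library names are only an example:

    # -Wl,-Bstatic and -Wl,-Bdynamic toggle the linker between static archives
    # and shared libraries for the libraries listed after them.
    g++ -o myprogram main.o \
        -Wl,-Bstatic -lbotan-2 \
        -Wl,-Bdynamic -lGL -lX11 -lpthread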

Finally, you may also provide a container image, e.g. for Docker, that ships your software together with all its dependencies: your users will only need Docker to run your application very portably. However, this is probably overkill in most situations, and whether it is a practical solution depends on your application and target audience.
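If you do go that route, a minimal sketch looks like the following; the image name is illustrative, and an OpenGL/X11 program would additionally need access to the host's X socket and GPU:

    # Build an image from a Dockerfile in the project root, then run it.
    docker build -t myprogram:1.0 .
    docker run --rm myprogram:1.0 --help
    # For an X11/OpenGL application something like this is typically needed as well
    # (exact host setup varies):
    #   docker run --rm -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix --device /dev/dri myprogram:1.0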

I was asked to expand my comment into an answer, so here it is.

The project I was talking about is called Woodpecker hash Bruteforce , and I distribute it as plain archived executables for Mac OS, Windows and Linux.

Woodpecker hash Bruteforce has only two dependencies I have to care about (the users don't need to install anything): OpenSSL and Botan, the libraries used for hashing. I've got two virtual machines on my Mac where I build the project, and several scripts to automate the process. I'm using Docker (together with VirtualBox) and VMware Fusion.

Above I said the users don't need to worry about any third-party libraries because everything is linked statically into the executable: you just download the appropriate file from the official website, unarchive it (if needed), sudo chmod +x the executable, and that's it!

This works on any version of Linux, including Ubuntu (this is where I perform the build) and Kali Linux.
