Well, I must have over a hundred tips on sysadmin and development tasks that I wrote up over the past few years and still need to catch up on posting to this blog.
This is a quick example of installing and enabling Jenkins on Ubuntu. Jenkins is a continuous integration system. Basically, it provides a web interface that shows jobs and their results; the same interface is used to configure it, extend it, and control jobs automatically or manually. It is commonly used to build and test software on other systems and show the results in a single place. I have been using it for over four years. (Prior to that, I did research it, but wrote my own lightweight and portable testing framework that had, at one time or another, over 40 systems attached to it.)
While some Jenkins components appeared in my "apt-cache search jenkins" output, I didn't see a main or meta package, and all the docs I found said to get it from upstream:
$ wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
$ echo deb http://pkg.jenkins.io/debian-stable binary/ | sudo tee -a /etc/apt/sources.list.d/jenkins.list
$ sudo apt-get update
$ sudo apt-get install jenkins
The above also enabled and started the jenkins service. Jenkins is Java-based software. By default, its webserver listens on port 8080, so visit that. At first start, it needs a password; mine was at /var/lib/jenkins/secrets/initialAdminPassword . It was running Jenkins 2.7.4.
Then choose the suggested plugins or select your own. I chose my own and selected the "SSH plugin". Others were already checked, so I left those suggestions as-is. Jenkins then showed the "Getting started" progress bar as the plugins and their required dependencies were installed.
Next, create the first admin user and click the "Save and Finish" button. Then click the "Start using Jenkins" button when it says setup is complete.
Welcome to Jenkins.
The main interface has a "Please create new jobs to get started." link. Follow it and enter an item name; for a quick test, I entered "ps". Then choose the type; I chose "Freestyle project" and clicked OK. The next page offers further details and options, such as parameterized builds, discarding old builds, using Git, and build triggers like periodic runs or SCM polling.
Under the "Build: Add build step" section, I selected "Execute shell script on remote host using ssh". For the command I entered "ps auxwww". Note that the SSH site dropdown is empty and you cannot type into it, so save, and then from the main Jenkins page go to Configure System (the /configure link). Under SSH remote hosts, click the Add button for the SSH sites that projects will connect to. Be sure to add the port number (22). I created a testing account for this (useradd -m ...; sudo passwd ...).
Back at my new project's configure page, it now showed the user@host:port entry for the SSH site. I clicked Save. Note that the common Jenkins setup is to have a Java-based Jenkins agent running on each remote system; the SSH method is a lightweight alternative.
I clicked "Build Now"; it said the job was scheduled, and a moment later I had build #1 in the build history. I clicked it, then "Console Output", and saw the ps output.
Jenkins has a lot of plugins and features for integrating with many build and test systems, presenting results properly, and understanding output, such as identifying changes between runs.
Long ago I wrote some articles and gave some lectures about simplifying Linux systems by running fewer processes. Today I looked at my Ubuntu laptop and saw it always has a load average around 0.33 and over 80 processes running. There were some obvious programs I didn't need and some I didn't know about. I used dpkg -S to find out which packages provided the processes. (Some didn't have man pages.)
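As a sketch of how one might survey this (assuming a Debian/Ubuntu system with dpkg), the following lists the unique command names of running processes and maps each resolvable binary back to its owning package:

```shell
# List unique command names of running processes; for each one that
# resolves to a binary on $PATH, ask dpkg which package shipped it.
# (Kernel threads and truncated names will simply be skipped.)
ps -eo comm= | sort -u | while read -r name; do
    path=$(command -v "$name" 2>/dev/null) || continue
    dpkg -S "$path" 2>/dev/null || echo "$path: not owned by any package"
done
```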
I removed modemmanager; edited /etc/default/console-setup and moved /etc/init/tty.conf to reduce my virtual terminals; removed tor and a GeoIP database for Tor; removed colord (why do I need a system daemon to manage device colour profiles?); removed winbind and its PAM plugin (why do I care about Windows domain user/group lookup?); and removed ntp, adding a crontab entry to run ntpdate twice a day instead. There is more to understand -- like why do I need a daemon to provide a DBUS interface for adding, modifying, and deleting user accounts (the accountsservice package)?
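The crontab entry was along these lines (the minute/hour values and the NTP server name are illustrative; use whichever pool server you prefer):

```
# run ntpdate twice a day; times and server are examples
17 5,17 * * * /usr/sbin/ntpdate -s pool.ntp.org
```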
This last month, we published another BIND DNS book: the second edition of our custom extended and improved reference manual. The first book was printed in September 2007. Since then we have published around nine other books (including a printed DNSSEC specifications book) — and I started working full time in the DNS field (probably because the first edition helped get me in the door). For the first few years, we sold lots of copies, and the book was given to many BIND DNS students throughout the world. We were asked several times for an updated edition, but each time we got started, we ended up falling far behind as the technology (and book) changed. The first edition was done in LyX without any real revision control. The second edition was done in DocBook using Git.
Since the first edition, BIND added many new features, including:
- dlz search
- logging file versions
- DSCP support for traffic classification for quality of service
- managed-keys for automated updates of DNSSEC trust anchors
- rndc addzone, delzone (and allow-new-zones)
- rndc flushtree to selectively remove a name and everything below it from the cache
- auto-dnssec for automated signing
- rndc signing
- rndc scan and automatic-interface-scan
- deny-answer-addresses and deny-answer-aliases for content filtering to prevent DNS rebinding attacks
- dns64 for synthesizing AAAA responses from IPv4 addresses
- dnssec-validation auto
- filter-aaaa and filter-aaaa-on-v4 and filter-aaaa-on-v6
- GeoIP (and geoip-directory)
- max-recursion-depth and max-recursion-queries
- prefetch to re-query popular names before they expire, to keep them in cache
- rate-limit (with 15 options)
- response-policy (with 9 policies)
- rndc secroots and secroots-file
- session-keyalg, session-keyfile, and session-keyname
- sig-signing-nodes, sig-signing-signatures, sig-signing-type, and sig-validity-interval
- use-v4-udp-ports and use-v6-udp-ports
- masterfile-format to keep zone files in raw or map format instead of text
- named changed behavior to preserve the case of query names, which can be turned off with no-case-compress
- in-view to share zone data between views
- static-stub zones
- redirect zones
- additional update-policy policies: local, tcp-self, 6to4-self, zonesub, and external
- server-addresses and server-names
- rndc sync
- rndc zonestatus
- delv tool
- dnssec-checkds tool
- dnssec-coverage tool
- dnssec-dsfromkey tool
- dnssec-importkey tool
- dnssec-keyfromlabel tool
- dnssec-revoke tool
- dnssec-settime tool
- dnssec-verify tool
- named-journalprint tool
- named-rrchecker tool
- ddns-confgen tool
- arpaname tool
- genrandom tool
- isc-hmac-fixup tool
- nsec3hash tool
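To give a flavor of how a few of the above look in named.conf, here is a minimal sketch (the values are illustrative examples, not tuning recommendations):

```
// illustrative named.conf fragment; values are examples only
options {
        dnssec-validation auto;   // use the built-in root trust anchor
        prefetch 2 9;             // re-query popular names before expiry
        max-recursion-depth 7;
        max-recursion-queries 75;
        rate-limit {
                responses-per-second 10;
        };
};
```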
In addition, some features were deprecated or changed:
- queryport-pool-ports, use-queryport-pool, etc
- stats-server became statistics-channels
Our book also covers several other bleeding edge features like:
- dyndb (dynamic database) for external data source
- buffered logging
- lwres-clients and lwres-tasks
- DNS cookies with cookie-algorithm, cookie-secret, nocookie-udp-size, require-server-cookie, send-cookie
- fetch-quota-params, fetches-per-server, fetches-per-zone
- limit on files concurrently open
- notify-rate and startup-notify-rate
- rndc nta for Negative Trust Anchors to temporarily disable DNSSEC validation (with nta-lifetime and nta-recheck)
- response-policy log
- serial-update-method date
- servfail-ttl to cache SERVFAIL responses
- rndc managed-keys
- rndc modzone and showzone
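And a similar sketch for a few of the newer options (again, values are examples only):

```
// illustrative fragment for some newer options; values are examples only
options {
        send-cookie yes;          // DNS cookies
        nta-lifetime 3600;        // negative trust anchor lifetime
        servfail-ttl 1;           // briefly cache SERVFAIL responses
        fetches-per-zone 200;     // limit concurrent fetches per zone
};
```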
The BIND DNS Administration Reference book is the only printed book covering all these topics. Note that the most popular DNS book is ten years old, so it cannot cover the above features as our book does. Our book also includes installation instructions, examples of using vendor packages, and lots of other original content, plus detailed indexing and additional cross-referencing.
Book details are at http://www.reedmedia.net/books/bind-dns/ or order it from your favorite book store.
My first real job after I finished my bachelor's degree was as a Unix admin for a local Internet service provider. While I had some professional background in Windows installations, web development, and Windows-related bug writeups (including in InfoWorld), I had no real commercial experience with Unix. My degree was in journalism and my previous degree was in physical education. For the previous two years, I had had a login on a Sun system (from the university) and a Debian Linux system (from the computer club), where I learned basic HTML and CGI programming using Perl. I wrote a website management system for news and magazine sites, which I used for the university newspaper and sold to other news and magazine organizations.
I had installed Debian a few times and started reading Unix-related books. I experimented in a home lab of old junk computers implementing various services as if it was a commercial Internet enterprise.
Somehow I convinced the ISP (IWBC) to hire me even though I had no experience with the operating system they used. They did tell me that they wanted to hire someone with a bachelor's degree, so that helped. They offered dial-up service, DNS, email, and web hosting for homes and companies. They also published a website of links to pre-approved sites (like a "Yahoo" directory).
My role was to learn everything I could from the existing head admin, who was moving away in three months. The guy was a jerk --- this was before I learned about the Unix BOFH (the sysadmin who takes out his anger on users, colleagues, and more). I followed him around the office with a notebook and watched him as he added user accounts on BSD/OS, added DNS zones in BIND, enabled and tested email users, tweaked Sendmail, added virtual hosts to Apache, etc. I took pages and pages of notes when we upgraded or installed BSDI. I wrote down commands, command line switches, expected output, error messages, and more.
Soon he was gone, and I became the head admin. We only had around 500 domains (websites) but maybe around 10,000 dialup customers. Every day brought a variety of typical sysadmin tasks. But I had my notebook. I worked with the sales and phone reps so they could handle most tasks without me (like changing passwords).
I wrote scripts to automate my work, such as adding new email users, adding new domains to named.conf, or adding virtual hosts to httpd.conf.
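As a sketch of the httpd.conf case (the function name, conf path, and DocumentRoot layout here are hypothetical, not the actual scripts I used):

```shell
#!/bin/sh
# Append a minimal Apache VirtualHost block for a new domain.
# Conf path and DocumentRoot layout are illustrative only.
add_vhost() {
    domain=$1
    conf=$2
    cat >> "$conf" <<EOF
<VirtualHost *:80>
    ServerName $domain
    ServerAlias www.$domain
    DocumentRoot /home/www/$domain
</VirtualHost>
EOF
}

# Example run against a scratch file rather than a live config:
conf=$(mktemp)
add_vhost example.com "$conf"
grep -c "ServerName example.com" "$conf"   # prints 1
rm -f "$conf"
```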
I joined various email lists to discuss the software I used with experts in their communities. I asked questions. And I took notes.
As I improved my routines and automated the processes, I had more and more free time. My co-workers played online bingo and built servers for downloaded music.
I read man pages. I read more man pages. I checked out every Unix and related book from the local libraries and took careful notes.
I installed extra systems on surplus hardware to try development versions of the software we used or to test alternative software. This led to the one time I had a system's root compromised (thank you, BIND 8), which taught me about server security and better administration practices. (That experience led me to years of working as a security consultant and auditor.)
Then I started answering more and more questions on the mailing lists. I tried to solve every problem posted. Before I read any follow-up emails, I tried to understand the issue, read the related docs, sometimes experimented with the software or configuration, and developed a solution or answer. I continued to take careful notes about the bugs and the solutions. For a while, I hosted a website with ISP Frequently Asked Questions.
Our ISP started to fail. The company spent all its money on expensive VOIP hardware, unused Oracle database licenses, and new tech staff that sat at their desks literally doing nothing --- no computers even assigned after months. Again as my co-workers played, I studied. I'd go to the developer's offices and ask them to teach me to code. I shared my examples with them and read their code. I found open source software that we used for our servers and for my day-to-day sysadmin tasks and thought of ways to improve them.
One issue was that we didn't want to have hundreds or thousands of Unix user accounts just to offer email service. (That is, the users didn't need any Unix login or user privileges.) So I found a light open source POP3 mail server and extended it to support virtual users. I contributed over ten patches back over time and soon was maintaining a fork of the software under a new name. It was the only option out there at the time; ISPs around the world, even those with tens of thousands of users, started using it, and I received lots of feedback and code contributions.
I started attending a monthly Linux users group about an hour from my house, and soon I was giving lectures and sharing my new knowledge. I sent proposals to local computer newspapers and magazines and wrote over 15 articles related to Unix and open source software.
I was able to get another job as a journalist and sysadmin, starting a website for BSD news and tutorials. The company (Internet.com, later Jupiter Media) sent me an $8000 server to install, and I was actually paid another $3500 a month to learn! This is where I started with my operating system of choice. (Later that job extended to cover Apache news and tutorials, PHP news and tutorials, and even Linux news.)
I researched Unix sysadmin roles and ran job task analysis surveys, privately and publicly, to understand the work of web server operators. I installed lots of different or alternative software to compare and contrast and to learn other points of view. As I experimented, if I found bugs or thought of enhancements, I would send my notes and even code fixes to the software developers. (These activities helped lead to my later work as a certification expert and a packaging expert.)
When the ISP failed, I was the last employee left -- the only person making sure the systems stayed running until they were powered down. (In hindsight, I should have taken all the customers.) An accountant joined me at the empty office one day and told me they couldn't pay me -- for the third month in a row. He wrote it down on his clipboard as I took multiple desks, a rack, several servers, office chairs, a photocopier, and miscellaneous office supplies -- which I gave away over the years and used for later learning and business.
Soon I began advertising my new expertise as a consultant and trainer ... but that is another story.
The lessons of this short period of learning are:
- Read every book I could find about the subject; take detailed notes, and go back through the pages I highlighted to try the examples.
- Ask co-workers to teach me and take careful notes.
- Read the documentation for the software. Read extra manuals too.
- Join communities to ask questions and also to share experience.
- Don't waste time, but keep learning and experimenting.
- Document what was learned in detail.
- Share with others and you will learn more.
This short path in life was different than what I expected when I went to college, but it turned my entire life around. It changed my focus and gave me new direction. It jump-started a career in helping others via open source software.
I am at the Internet Engineering Task Force meeting this week. The IETF is the primary organization that creates and publishes internet standards, such as HTTP for accessing web documents and SMTP for sending email. The IETF is an interesting group — it has no real membership, no voting, and all standards work is done voluntarily. All the published documents are freely available for individuals, companies, software developers, and hardware manufacturers to create solutions that will interoperate with others. Computer scientists throughout the world have travelled to Dallas this week to improve the internet's security and efficiency.
I am joining sessions about Scalable DNS Service Discovery, DNS PRIVate Exchange, Home Networking, Hypertext Transfer Protocol (HTTP), Domain Name System Operations, DNS-based Authentication of Named Entities (DANE), Sunsetting IPv4, Dynamic Host Configuration, and maybe more.
The volunteer nature of the entire effort, resulting in a better internet, amazes me. While governments and companies may mandate some technologies, it is interesting that these core technologies are voluntarily created and voluntarily used. Another book idea?