Monday, September 07, 2020

How to find out which package a file belongs to on Linux/Unix.

This post shows how to find out which software package created a file on a Linux system. How to go about this task depends on the package manager in use. This post shows how it is done with three of the most popular package managers: RPM (Red Hat/CentOS, etc.), dpkg (Debian/Ubuntu, etc.), and pkg (FreeBSD):

rpm -qf <path/to/file>
pkg which <path/to/file>
dpkg-query -S <path/to/file>

Running the commands on CentOS and Ubuntu to find which package created /etc/ssh/ssh_config looks like this:

rpm -qf /etc/ssh/ssh_config
openssh-clients-5.3p1-124.el6_10.x86_64

dpkg-query -S /etc/ssh/ssh_config
openssh-client: /etc/ssh/ssh_config

Thursday, June 11, 2020

How to cross compile Rust on macOS and target Linux using a Docker container

I use a Mac for development, and recently I had to compile a Rust project to be run on a Linux server. This post captures how I managed to get this done. Note that the method highlighted in this post can also be used if you have a Windows (or even a Linux) dev environment.

The first thing I attempted was to install the necessary toolchain that should, in theory, allow me to cross-compile the Rust project on my Mac and target Linux. This did not go as smoothly as I would have loved. The first issue I encountered was with OpenSSL. After fixing that, the next issue was with backtrace-sys, with the build failing with `error: failed to run custom build command for backtrace-sys v0.1.34`.

This suggested that I probably did not have the right environment/toolchain/dependencies installed.

I had the option of trying to identify all the required dependencies, but I was really running out of time and, frankly, hunting around for the correct dependencies needed to set up a working compiler toolchain is not fun either. So I ended up making use of Docker to help with the build process.

What I did was to create an image based on Alpine Linux with the required toolchain. I then use a container spun up from that image for the build process. The procedure is as follows:

The Dockerfile:

FROM alpine:3.11 AS build
LABEL maintainer="Dadepo Aderemi"

#
# -- Install Rust build toolchain
#
ENV RUSTFLAGS="-C target-feature=+crt-static"
RUN apk add rust cargo openssl-dev

The +crt-static flag ensures static linking of the C runtime dependencies. For more on this, see Static and dynamic C runtimes.

Now make sure you are in the same directory where the Dockerfile is located and create the image by running:

docker build -t rust-linux-builder .

Once built, run and connect to the container, with the source code of the project mounted as a volume, by running:

docker run -it --rm --net=host -v $(pwd):/build rust-linux-builder

Note that using $(pwd) assumes your current directory is where the source of the project is located. Also, the --rm flag ensures the container is removed automatically on exit; there is no need to keep the container around after finishing with the build.

Once in the container, switch to /build then run the build command:

/ # cd /build/
/build # cargo build --target x86_64-alpine-linux-musl --release

Once the build completes, exit the container. The binary can be found within the target directory, from where it can be copied and installed on the Linux server where it will be executed.

As you can see, there is nothing complicated in the Dockerfile, and you can easily create your own, with your own modifications if need be. But in case you do not want to do that, I pushed the Docker image to Docker Hub. This means you can easily build your Rust project targeting Linux by running:

docker run -it --rm --net=host -v $(pwd):/build dadepo/rust-linux-builder

...and follow the steps outlined above.

Sunday, March 01, 2020

Learning Rust - Day 10 - Smart Pointers

This is the 10th entry of my learning Rust journal...

It captures some of the learning points while going through chapter 15 of the Rust Book. You can read other posts in this series by following the label learning rust.

I enjoyed reading this chapter. I found it particularly interesting because it was about concepts I usually do not need to think about when working with the other programming languages I have used before now. Apart from that, it also allowed me to invalidate some wrong assumptions I had picked up along the way and to consolidate some of the concepts I have been learning.

One of the assumptions I had, which I found out was wrong while going through this chapter, has to do with the stack vs the heap. For some strange reason, I had thought that structs and enums are always on the heap. I suspect my familiarity with Java is to blame for this wrong assumption, since if you squint hard enough a struct looks like an object in Java, and in Java the use of the new keyword always means allocating memory on the heap. But this is not the case in Rust: a struct or an enum does not automatically mean heap memory allocation.
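To make this concrete, here is a small sketch of my own (not from the book): a struct value lives on the stack by default and only ends up on the heap when it is explicitly placed there, for example via the Box<T> smart pointer:

// A plain struct: defining or creating one implies no heap allocation.
#[derive(Debug)]
struct Point {
    x: i32,
    y: i32,
}

fn main() {
    // `on_stack` is stored directly on the stack.
    let on_stack = Point { x: 1, y: 2 };

    // Box<T> is one of the smart pointers from chapter 15: it explicitly
    // allocates the Point on the heap and keeps a pointer to it on the stack.
    let on_heap = Box::new(Point { x: 3, y: 4 });

    println!("{:?} {:?}", on_stack, on_heap);
}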

Saturday, February 29, 2020

Rust Ownership Rules

If you have been following this blog, then it would have been obvious that at the beginning of this year I started learning Rust. This blog post is a breakaway from the journal style of capturing the main points I encountered while reading the Rust Book. It instead captures my understanding thus far of Rust's ownership rules.

One of Rust's main differentiators is that it provides memory safety. It does this through compile-time guarantees: code that could potentially lead to memory bugs is flagged as a compile-time error. These compile-time guarantees enforce what is normally referred to as the ownership rules. In this post, I take the opportunity to re-summarise what I consider to be the essence of these ownership rules in Rust. The key points can be outlined as follows:
  • Values are owned by variables. 
  • When the owning variables go out of scope, the memory the value is occupying will be deallocated.
  • Values can be used by other variables; they just need to adhere to certain rules that are enforced by the compiler.
The ways that other variables make use of a value can be grouped into four categories, and the way a value is used dictates the rules to be adhered to (see the sketch after this list):
  • Clone: Here the value is copied to the other variable. The other variable gets its own ownership of the copied value, while the original variable keeps the ownership of its value.
  • Move: Here the ownership is handed over to the other variable that wants to make use of the value. The original variable no longer has ownership.
  • Immutable Borrow: Here no ownership transfer occurs, but the value can be accessed for reading by the other variable. The memory is not deallocated when the borrowing variable goes out of scope, since the borrowing variable does not have ownership.
  • Mutable Borrow: Here the value can be accessed for both reading and writing by the other variable. The memory is also not deallocated when the borrowing variable goes out of scope, since the borrowing variable does not have ownership.
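To make these four categories concrete, here is a minimal sketch of my own (not from the book; the variable names are only for illustration):

fn main() {
    // Clone: `a` keeps ownership of its value, while `b` owns an independent copy.
    let a = String::from("hello");
    let b = a.clone();
    println!("{} {}", a, b);

    // Move: ownership of the String is handed over to `c`;
    // `b` can no longer be used after this point.
    let c = b;

    // Immutable borrow: `r` can read the value but does not own it, so
    // nothing is deallocated when `r` goes out of scope.
    let r = &c;
    println!("{}", r);

    // Mutable borrow: `m` can read and write the value, still without owning it.
    let mut d = String::from("hi");
    let m = &mut d;
    m.push_str(" there");
    println!("{}", d);
} // `a`, `c` and `d` go out of scope here and their memory is deallocated.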

Tuesday, February 04, 2020

Learning Rust - Day 9 - Closures and Iterators

This is the 9th entry of my learning Rust journal...

It captures some of the learning points while going through chapter 13 of the Rust Book. You can read other posts in this series by following the label learning rust.

This chapter did not present any new or mind-bending concepts. Most modern languages nowadays have functions as first-class citizens, closures, and iterators. So the chapter was about taking note of how these concepts are encoded in Rust.
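As a quick reference, here is a small sketch of my own (not taken from the book) of what closures and iterators look like in Rust:

fn main() {
    // A closure: an anonymous function that can capture its environment
    // (here it captures `threshold`).
    let threshold = 3;
    let above_threshold = |x: &i32| *x > threshold;

    // Iterators are lazy: nothing runs until a consuming adaptor
    // such as `collect` is called.
    let numbers = vec![1, 2, 3, 4, 5];
    let filtered: Vec<i32> = numbers
        .iter()
        .copied()
        .filter(|x| above_threshold(x))
        .collect();

    assert_eq!(filtered, vec![4, 5]);
}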

I did not really enjoy how this particular chapter was written, especially section 13.01. I think the pedagogy could be improved: the section spends way too much time motivating an example that, in my opinion, clouds the essence of what is being explained. It so happened that I found another book on Rust, Introduction to Rust, which ended up being really well written: concise and well explained. I personally enjoyed its chapter on closures more than I did reading the Rust Book itself.

So, on to the content of this chapter. Here are some of the things that stood out:

Sunday, January 26, 2020

Learning Rust - Day 8 - An I/O Project: Building a Command Line Program

This is the 8th journal entry of my learning Rust journey. It captures some of the learning points while going through chapter 12 of the Rust Book. You can read other posts in this series by following the label learning rust.

Not many new concepts were introduced in this chapter. The aim was to apply the material presented in the book up to that point by building a trivial command line tool.

While working through the chapter, though, I observed a couple of different strategies for dealing with errors via the Result<T, E> type.

Turn an error into a boolean via is_err
This seems handy when you want to convert the result to a boolean based on whether the result was a success or not:

let x: Result<i32, &str> = Ok(-3);
assert_eq!(x.is_err(), false);

let x: Result<i32, &str> = Err("Some error message");
assert_eq!(x.is_err(), true);

Get the success value or run some logic via unwrap_or_else
This allows getting the success value and, in case of an error, passing a callback function to process the error:

fn count(x: &str) -> usize { x.len() }

assert_eq!(Ok(2).unwrap_or_else(count), 2);
assert_eq!(Err("foo").unwrap_or_else(count), 3);

Ignore success and only run some code on failure via the if let syntax
This seems to be useful in cases where you only want to do something when a function call returns an error:

if let Err(e) = run(config) {
    eprintln!("Application error: {}", e);
    process::exit(-1)
}

These are by no means all the available error handling strategies when dealing with Result in Rust. These were only the ones that I picked up on while reading through chapter 12.

Another thing worth noting, which was more or less a culture shock, is the practice of putting unit tests in the same file as the code being tested. All the languages I have used before now had the practice of keeping the tests external to the code being tested; but it seems that in idiomatic Rust, the tests go together with the implementation. This will require some getting used to!

That was it for this chapter. I now look forward to exploring the Functional Language Features in Rust as I proceed to chapter 13.

Friday, January 24, 2020

Learning Rust - Day 7 - Writing Automated Tests

This is the 7th journal entry of my learning Rust journey. You can read other posts in this series by following the label learning rust.

In this session, I read through Chapter 11 of The Rust Book. It was about writing automated tests in Rust. This chapter was a breeze, for obvious reasons. In fact, the most interesting thing I learnt was not even about testing in Rust, but about another feature of the language: Attributes.

I have always noticed things like #[derive(Debug)], #![allow(unused_variables)], etc. being used in the language, but I never stopped to actually read up on what they are. I knew they were a facility for providing some form of metadata in the language, sort of like @annotations in Java; I just never took the time to find out what they are officially called in Rust.

It turned out that the testing mechanism in Rust revolves around the use of this language feature, notably #[cfg(test)] and #[test], so I took the opportunity to find out exactly what these things are.

They are Attributes and they are:
...a general, free-form metadatum that is interpreted according to name, convention, language, and compiler version...
They can exist in two forms. Outer attributes are the ones that start with only #; they are placed before the thing they apply to, i.e. before a struct definition, function definition, module definition, etc., and they apply to the item that follows the attribute. Inner attributes are the ones that start with #!; they are placed inside something, e.g. at the root of a crate in order to apply an attribute to the whole crate, and they apply to the item within which the attribute is declared.
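To illustrate the difference, here is a small sketch of my own (the Config struct is just a made-up example):

// Inner attribute: starts with #! and sits inside the thing it applies to;
// placed here at the top of the crate root, it applies to the whole crate.
#![allow(unused_variables)]

// Outer attribute: starts with # and is placed before the item it applies
// to, in this case the struct definition that follows.
#[derive(Debug)]
struct Config {
    verbose: bool,
}

fn main() {
    let config = Config { verbose: true };
    println!("{:?}", config);
}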

Attributes can also be classified into the following kinds: Built-in attributes, Macro attributes, Derive macro helper attributes, and Tool attributes. So far, I find the Built-in attributes to be the most interesting ones because, based on my knowledge of Rust, they are the ones I have mostly encountered. A list of these Built-in attributes can be found here.

Apart from learning more about Attributes, there were a couple of things I picked up about testing in Rust that are worth noting:

  • Use the #[cfg(test)] attribute on modules that contain test functions. Use the #[test] attribute on the test functions within the test module (see the sketch after this list).
  • assert!, assert_eq! and assert_ne! are macros that can be used for asserting test conditions.
  • Favour assert_eq! and assert_ne! over assert! because they provide more useful messages in case of test failures and allow specifying custom error messages on test failures.
  • cargo test --help displays options that can be used with cargo test, while
    cargo test -- --help displays the options you can use after the separator --. It looks like the latter can only be run in the root of a Rust project.
  • You cannot use the #[should_panic] attribute on tests that use Result<T, E>; instead, an Err value needs to be returned to signify an expected error.
  • Run the tests with cargo test -- --test-threads=1 to prevent concurrent execution of tests 
  • Use the #[ignore] attribute to ignore some tests unless they are specifically requested.
  • To specifically run a single test, pass the name of the test function to cargo test. That is
    cargo test name_of_test_function 
  • Use cargo test -- --nocapture to also see the output (if any) from succeeding tests. By default, output is shown only for failed tests.
  • If there is setup code to be shared across different tests, then make sure to put it inside tests/common/mod.rs instead of tests/common.rs. Not doing this would make the setup code appear in the test results.
  • Unit tests are placed in the same file as the module, while integration tests go into tests/integration_test.rs; integration tests do not need the #[cfg(test)] attribute, only the #[test] attribute.
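To pull a few of these points together, here is a small sketch of my own (not taken from the book; add and divide are made-up functions) of what a unit test module typically looks like:

pub fn add(a: i32, b: i32) -> i32 {
    a + b
}

pub fn divide(a: i32, b: i32) -> i32 {
    if b == 0 {
        panic!("division by zero");
    }
    a / b
}

// The #[cfg(test)] attribute ensures the module is only compiled when
// running `cargo test`.
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn adds_two_numbers() {
        // assert_eq! gives a more useful message than assert! on failure
        // and also accepts a custom message.
        assert_eq!(add(2, 2), 4, "2 + 2 should be 4");
    }

    #[test]
    #[should_panic(expected = "division by zero")]
    fn dividing_by_zero_panics() {
        divide(1, 0);
    }

    #[test]
    #[ignore]
    fn expensive_test() {
        // Only runs when specifically requested, e.g. via `cargo test -- --ignored`.
        assert!(add(1, 1) > 0);
    }
}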
That was it for learning about writing automated tests in Rust. I will be taking on Chapter 12 next, which is: An I/O Project: Building a Command Line Program. It looks like a chapter that will help solidify some of the concepts presented in the book thus far. Looking forward!