Locales... (Debian/Ubuntu)
If you are getting locale warnings, remember to run locale-gen with a list of locales to generate:
# locale-gen en_US en_US.UTF-8
Then run dpkg-reconfigure locales.
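To make one of the generated locales the system default, update-locale should do the trick on Debian/Ubuntu (the locale value here is just an example):
# update-locale LANG=en_US.UTF-8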
Remember that fuser can be used to show which processes are using files or sockets, and a whole lot more. There is of course also lsof.
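For example, to see what is using a mount point, or which processes have a particular file open (the paths here are just placeholders):
# fuser -vm /mnt
# lsof /var/log/syslog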
You can use blkid to look up filesystem UUIDs, labels and types; there are plenty of good options in the man page.
Also see /dev/disk/by-uuid (the other /dev/disk/by-* directories are quite interesting too).
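For instance, to list UUIDs and filesystem types for all block devices, and to see the same mapping as symlinks:
# blkid
# ls -l /dev/disk/by-uuid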
From a remote desktop session on Windows you can run “shutdown -i” to bring up the remote shutdown dialog, and reboot the machine from there.
Notes on VMware Server on Windows 7 Ultimate x64:
While not supported in the UI, you can clone VMs.
See this post.
Note: cloning a machine with snapshots doesn’t seem to work; you can only clone the base image. With VirtualBox you could get around this by merging all the snapshots into the main disk before cloning. I’m not sure if this is possible with VMware.
You can check SSL/TLS connections with:
openssl s_client -connect HOST:PORT
If you need to provide a CA certificate, pass it in with an absolute path:
openssl s_client -connect HOST:PORT -CAfile /path/to/ca
Get information about an x509 cert:
openssl x509 -in /path/to/cert -text
Remember man openssl
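For example, to quickly check a certificate’s validity dates (the path is a placeholder, as above):
openssl x509 -in /path/to/cert -noout -dates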
Recently I’ve found myself playing with Fog[1] quite a lot. If you don’t know it, it is a really nice library for working with different cloud providers. If you’re coding in Ruby and using multiple cloud computing vendors, you should check it out.
One of the nice things about Fog is that it supports Amazon’s S3 multipart uploads[2]. This is a feature of S3 that Amazon recommends you use if the files you want to upload are larger than 100 MB. It just so happened I had a bunch of files that fit the bill.
Multipart uploads are neat as there is no expiry on the upload; you need to either complete it or abort it. This would let you schedule part uploads during times when your network traffic is quiet. You are also able to recover from a single part failing without it affecting the whole file upload.
The basic steps are: initiate the multipart upload, upload each part (keeping the ETag that S3 returns for it), then complete the upload with the list of part ETags (or abort it).
After some hacking about I had a basic script that would take a file, split it, get the Base64-encoded MD5 of the parts and upload them (the hacky results are in https://gist.github.com/907430). This worked well, however I really wanted to upload multiple parts at once to increase the speed, so I investigated threading in Ruby.
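As a rough sketch, the split-and-checksum step looks something like this in the shell (the file name is a placeholder, and 5 MB is S3’s minimum size for every part except the last):
$ split -b 5M bigfile part-
$ openssl md5 -binary part-aa | base64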
The results are presented below as a proof-of-concept script. The main thing that had me scratching my head was completing the upload. Originally I had been pushing the ETag from each part onto an array, however as the threads can run in different orders and finish at different times, there was no guarantee of the order of the tags in the array. Once I realised this I explicitly set each array element to its corresponding part’s tag, and the uploads would complete.
The above is far from perfect but it is working for me, and I hope it gets the general idea across. I now plan on taking this base and turning it into a system that can perform a single upload for small files and a multipart upload for large files.
[1] - http://fog.io
[2] - http://docs.amazonwebservices.com/AmazonS3/latest/dev/index.html?uploadobjusingmpu.html
— Gavin
I plan on doing some further work on this once I’ve got internet at home again, but for now here are some notes.
The source and display math uses RPN. There is a list of the functions in this mailing list post:
http://lists.omniti.com/pipermail/reconnoiter-users/2010-July/000467.html
Or you can check:
https://labs.omniti.com/labs/reconnoiter/browser/trunk/ui/web/lib/Reconnoiter_RPN.php
So far I’m working on the ones I can see in Theo’s OSCON presentation video.
As I continue to play with the graphs in reconnoiter I will add more examples.
Using the Ubuntu EC2 images from Canonical you can have user data executed as a script; simply start the data with #!. If you want to log what happens, you can redirect output to a file at the start of your script. E.g.:
#!/bin/bash
# Copy all stdout/stderr to a log file, to syslog (tagged user-data) and to the console
exec > >(tee /var/log/user-data.log | logger -t user-data -s 2>/dev/console) 2>&1
apt-get update
apt-get -y install build-essential
Or output can go straight to syslog with an appropriate logger command.
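For example, a one-off line to syslog (the tag is arbitrary):
logger -t user-data "user-data script starting"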
Remember about debconf-get-selections (part of debconf-utils). You can use it in conjunction with debconf-set-selections to set up configuration seed files for packages.
Very useful when installing the Sun Java JRE.
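A quick way to build a seed file is to pull the current selections from a machine that already has the package configured (the grep pattern is just an example):
$ debconf-get-selections | grep sun-java6 > sun_java.seed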
$ cat sun_java.seed
sun-java6-bin shared/accepted-sun-dlj-v1-1 boolean true
sun-java6-jre shared/accepted-sun-dlj-v1-1 boolean true
sun-java6-jdk shared/accepted-sun-dlj-v1-1 boolean true
……
$ sudo debconf-set-selections sun_java.seed
$ sudo apt-get install sun-java6-jre
This sort of stuff can be used with Chef, in the package resource, using the response_file attribute.