Installing Ruby RVM on Ubuntu 12.10

I've never used Ruby even though it's apparently the language of the hour (or maybe some hours ago - I'm often late on such stuff), but I had to use it today to run some scripts. I didn't know what RVM was until a couple of hours ago, but when I had to use it, I found out that the package doesn't exist in 12.10 :-/

There's probably some PPA out there with updated packages, but a quick search didn't reveal any, so I decided to install it manually. As the packaged Ruby is kinda outdated and doesn't work with some of the gems, I also installed a fresh version of that. To avoid clashing with anything that might need the packaged version, I installed it in my home directory. Luckily the RVM developers have made that very simple to do.

Prerequisites:

sudo apt-get install libxslt1.1 libxslt-dev xvfb build-essential git-core curl
Maybe they're not all needed for Ruby/RVM, but I followed instructions on how to run Cucumber tests for ownCloud.

Install RVM (from the RVM install page):
\curl -L https://get.rvm.io | bash -s stable --ruby
Add a line to your .bashrc that sources RVM's init script (no idea exactly what it does):
echo '[[ -s "$HOME/.rvm/scripts/rvm" ]] && source "$HOME/.rvm/scripts/rvm"' >> ~/.bashrc
Add the bin directory to your PATH:
echo 'export PATH=$HOME/.rvm/bin:$PATH' >> ~/.bashrc
And source it:
. ~/.bashrc
Finally, install the necessary gem dependencies for the project you want to run (from the directory containing its Gemfile):
bundle install
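
To check that everything is wired up correctly, these commands should report that rvm is a shell function and show the freshly installed Ruby:
type rvm | head -n 1
rvm list
ruby -v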

That's it. Feel free to comment, but I most likely can't answer any questions, because all I did was collect some snippets from various sources and boil them down to a few easy steps :-)



Backup ownCloud Calendar and Contacts

I have my ownCloud instance in a sub-domain at my hosting provider, so I prefer to make a weekly backup of my calendar and contacts. There is no documented way of doing this afaik, but if you run a *nix-like system with wget installed, it's actually pretty straightforward. In this example owncloud refers to the URL of the root of your ownCloud installation, e.g. http://example.com/owncloud, user is your login and password is your password.

The URL to use is the same one you use if you want to download the calendar or address book manually. In the upper right corner of the Calendar and the Contacts app there's a button named Calendars and Address books respectively. When you click it, you get a dialog box with a list of your calendar(s) and address book(s). To download manually you would push the Download button; instead of doing that, right-click on it and select "Copy link location" (or whatever your browser calls it) from the context menu. Your clipboard will now hold a URL looking something like owncloud/apps/contacts/export.php?bookid=1 - for the calendar it would look like owncloud/apps/calendar/export.php?calid=1.

Make a simple script using those URLs:
#!/bin/bash
DATE=`date +"%Y-%W"`

# Make sure the backup directory exists.
mkdir -p $HOME/pimdata_backup

# Download and gzip contacts.
wget --auth-no-challenge --no-clobber --http-user=user --http-password=password \
  -O $HOME/pimdata_backup/contacts-$DATE.vcf "owncloud/?app=contacts&getfile=export.php?bookid=1"
gzip -f $HOME/pimdata_backup/contacts-$DATE.vcf

# Download and gzip calendar.
wget --auth-no-challenge --no-clobber --http-user=user --http-password=password \
  -O $HOME/pimdata_backup/calendar-$DATE.ics "owncloud/?app=calendar&getfile=export.php?calid=1"
gzip -f $HOME/pimdata_backup/calendar-$DATE.ics
The --auth-no-challenge option makes wget send the credentials right away instead of waiting for an authentication challenge from the server, which would otherwise mess up the download. The --no-clobber option is there to avoid downloading a file that already exists; since the downloaded file gets a .gz extension appended when it's gzipped, that should normally not be an issue though.

The backup files will now have names containing the year and week number, like contacts-2012-17.vcf.gz and calendar-2012-17.ics.gz. If you prefer a backup schedule other than weekly, you will of course have to change the date command accordingly.

Save the script somewhere, preferably in your PATH - I use ~/bin for my scripts - and make it executable: chmod u+x ~/bin/owncloud_backup.sh - or whatever you choose to call it. Test that the script works, and set it up to run on a regular basis as a cron job or with any other job scheduler (see the example crontab entry below); I use KAlarm because it's so damn easy to use ;-) You could use curl instead of wget - it's just a matter of preference.

Update: As Klaus Muth mentions in a comment, it cannot be stressed enough that this method should not be used in a multi-user environment, as it places passwords in the process list at runtime.
Update 2 June 13 2012: I have updated the paths for ownCloud >= 4.3 (git master and stable4). HTTP Auth was broken in versions before that.
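
For reference, a weekly crontab entry for the script could look something like this (assuming you saved it as ~/bin/owncloud_backup.sh; adjust day and time to taste):
# m h dom mon dow command - edit with crontab -e
0 3 * * 0 $HOME/bin/owncloud_backup.sh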



Yet another MySQL vs. AppArmor barf

I freaking hate AppArmor! Of course only because I don't want to be bothered when an update makes a mess of it - I really don't know how it works, but I don't want to need to know either. Some months ago I tried out Logitech Media Server on my box, and it screwed it up big time. Now it seems there has been an update, so it no longer accepts paths that go through symlinks (on recent Ubuntu, /var/run is a symlink to /run). It seems logical that it shouldn't, but Ubuntu could have done a better job fixing it - or maybe it's because I had already edited the profile that it didn't get updated..? A search led me to an issue on Launchpad about it, but I've only skimmed through it. Anyway, today when I rebooted, MySQL wouldn't run and /var/log/syslog was filled with entries like this:
Mar 30 11:55:31 tanghus kernel: [ 1309.198481] type=1400 audit(1333101331.343:97): apparmor="DENIED" operation="mknod" parent=1 profile="/usr/sbin/mysqld" name="/run/mysqld/mysqld.sock" pid=7192 comm="mysqld" requested_mask="c" denied_mask="c" fsuid=114 ouid=114
Mar 30 11:55:36 tanghus kernel: [ 1314.463559] init: mysql main process (7192) terminated with status 1
Mar 30 11:55:36 tanghus kernel: [ 1314.463606] init: mysql main process ended, respawning
Mar 30 11:56:01 tanghus kernel: [ 1339.105333] init: mysql post-start process (7194) terminated with status 1
Mar 30 11:56:01 tanghus kernel: [ 1339.111425] type=1400 audit(1333101361.335:98): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/mysqld" pid=7291 comm="apparmor_parser"
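Before editing anything, you can check that the mysqld profile is actually loaded in enforce mode:
sudo apparmor_status | grep mysqld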
To fix it, edit /etc/apparmor.d/usr.sbin.mysqld and replace the lines:
  /var/run/mysqld/mysqld.pid w,
  /var/run/mysqld/mysqld.sock w,
with:
  /run/mysqld/mysqld.pid w,
  /run/mysqld/mysqld.sock w,
Then restart MySQL by running sudo service mysql restart - if it doesn't respawn by itself. AppArmor should automagically pick up the change to its configuration file; otherwise run sudo service apparmor restart.
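If you'd rather reload just that one profile instead of restarting the whole AppArmor service, something like this should also do the trick:
sudo apparmor_parser -r /etc/apparmor.d/usr.sbin.mysqld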


Is Kubuntu up for a great future?

One could argue so. Harald Sitter (apachelogger) writes that a lot of the base software Kubuntu relies on will move from the main repository to universe, and that it:
(...) bares a great deal of opportunities for Kubuntu. Primarily it gives the community yet bigger control over what the distribution looks like as we do not need to get software approved to be worthy of Canonical’s support. At the same time it also reduces the policy overhead (main inclusion for those who have heared of it). The detanglement allows us to move even closer to KDE without having to worry about conflicting interests (...)
I have used Kubuntu since one of the first releases (something with a hedgehog?) and enjoy that KDE packages appear in the PPAs quickly and well tested. One of my major complaints about Kubuntu is that packages such as qtwebkit are totally out of date, which causes crashes and missing functionality in the otherwise up-to-date KDE programs and libraries. Maybe a truly community-driven Kubuntu will prove to be much better in the long run? Via How Kubuntu Did Not Change | Apachelogger's Log.

