I recently needed to change my modem due to a technology change, and the new modem did not like to talk SNMP to the big bad Internet. It does, however, very happily do so on the local network. While this is probably not an issue for many, I do not have any decently powered server at home, only a rented one elsewhere that collects my network statistics.
To overcome the issue, I needed to get my OpenWRT router to bounce the SNMP packets to the modem and relay the replies back. And while this is seemingly possible with net-snmp's snmpd, the configuration is far from straightforward and the OpenWRT packages seem to be lacking in that regard.
Hence I wrote my very own, very simple proxy daemon. It is not smart at all and simply sends SNMP replies back to whoever made the last request. But it works for my purposes. If you are in similar need, you can grab the source.
I also made a slightly more complex version that allows redirection of different community names to different SNMP servers/agents (no community name rewriting).
While I have been trying to make my desktop experience lighter, I somehow keep coming back to gnome-terminal; I guess it's what I'm used to and what I know how to configure with anti-aliased fonts etc. However, its inability to launch URIs from the keyboard was hindering my dwm workflow, so I decided to do something about it.
First I tried a different terminal, namely rxvt-unicode, which has the functionality, but when I couldn't configure it to my liking in reasonable time (freetype font spacing, mostly), I decided to see if I could hack gnome-terminal to do what I wanted.
At first I used a quite naïve method, just going through the characters one by one (from end to start) and calling the VTE check method for each; and while it worked reasonably well, it turned out to be terribly slow in some scenarios, with a single scan taking more than a second on a reasonably powerful laptop.
Since gnome-terminal also stores the regexes internally, it seemed like a good idea to see whether I could grab the whole contents of the terminal, run the regexes on that, and check if the results came faster. And indeed they did, but it introduced a slight bug where some URIs are reported twice if they overlap. Lazy as I am, I chose to ignore this and just go with it.
After finding the URIs, it was just a matter of drawing them somehow and handling the keypresses to launch them. I chose a very simple launching mechanism, resembling the hints mode of vimperator/pentadactyl. Each hint is allocated a character from a set (once the set runs out, the remaining URIs are ignored). Then I drew these hints over the terminal as if they were tooltips, and voilà.
The resulting patch is 342 lines (226 lines added) and can be found here. It should apply cleanly on top of gnome-terminal 3.6.2-0ubuntu1 (Ubuntu 14.10) and with fuzz on vanilla 3.6.2. Applying on top of master takes some handyman work.
During a time void, also known as compile time among programmers, I started to play with an idea that had surfaced earlier in lunch time discussions with colleagues. The more I thought about it, the worse the idea seemed, and the more I wanted to see how much work it would be. The original idea was that git would make a great knowledge base, giving nice diffs and logs and so on, but the less technologically savvy were a bit reluctant, as the magic black-boxes-with-gray-text are not what they consider user friendly. Hence a better interface was needed. And of course plugging a server side wiki engine to a git backend was too simple of a solution. No no no. Of course it had to be done in the browser. And hence giki was born.
So, what exactly is giki? It is a slightly modified JS wiki formatting engine, jquery-wikitext, glued onto a self-made git repo parser utilizing inflate.js from zip.js. The result is a very simple read-only wiki, using a git repo over HTTP and doing all the magic in the browser.
Does it work? Sure it does. Is it useful? Not in the least. Why then? Because I could. If you're still interested, see it in action (use view source to get to the source). I also used the same git blob parser to implement a simple repo browser.
Remember this hamburger I ate? No? It's alright, why would you. Anyway, the previous one has a big brother. One that is quite angry about what was done to the little one.
Well, foolish as I am, I decided to accept the challenge, and went head to head with the behemoth. Not only is the Ultimate Death Burger more laden with hot sauces, the rules are also more strict.
The first rule: only 20 minutes to complete the meal. The burger is accompanied by a set of fiery french fries, which means there is not a lot of room for breathers. Fortunately for me, I'm usually quite a fast eater. I went through the fries in a bit shy of 5 minutes and continued onto the burger itself. The burger was quite smooth sailing until about halfway through, when the second rule started to play a role.
The second rule: only one pint of water allowed. After the halfway mark, my mouth felt like it was in flames, and I needed to quench the fire in between bites. Well, that was at first. But one bite later, my mouth would not agree to swallow the napalm-like substance, and I needed to fool it with the water. Because the water supply was so scarce, it was a delicate balance to have just the right amount. Nevertheless, to my great surprise, I managed to slay the beast, which is when the third, ultimate rule showed its head.
The third rule: stay seated for 10 minutes. No extra water, no milk, and definitely no vomiting. The seconds on the timer were not running, or even walking. They made snails seem hasty. Before the time was up, my fingers were numb, and I was generally disoriented. I had a pint of water ready, and after what seemed like an eternity I got to try to quench the fire within with it. Little use it was: the moment the water mixed with the hellish contents of my stomach, it overthrew any opposition that was still holding onto the nutrition, and I quickly had to excuse myself.
So all in all the feat was successful. Painful, moot and stupid, but nevertheless a success. Unfortunately I didn't even get a t-shirt. Only my name on a sheet of paper.
Eating a single, simple hamburger sounds easy, right? Well, at least in theory. But sometimes it is not quite that easy. The nightmares are slowly fading away, and I can share my experiences with one devil spawn of a burger.
The day started as a pretty good day. A friend was returning from a vacation, and I made a trip with my wife to welcome him back. I called him on the way, and challenged him to eat a hamburger. Unfortunately he had too many excuses, but agreed to eat something else while I feasted on a burger from the depths of hell.
So we went to the restaurant, and I hesitantly placed an order for one "Death Burger" and lots of water. Not that the water would help anything, but you are not allowed to consume dairy products if you wish to get the t-shirt. And after all, the t-shirt was what I was after.
Getting the burger revealed its horror. The smell of hot sauce was overwhelming. I had heard from multiple sources that it was a good idea to start with the fries, so that's exactly what I did. While spicy, they were reasonably easy to eat, except that they made my tongue feel like it was being burnt. But they were the easy part after all, and especially after deciding that it wasn't actually "temperature burn", only "spice burn", I finished them quite quickly.
And then there was the burger. Still smelling of hot sauce. And I don't mean Tabasco hot, or Texas Pete hot. I mean evil hot. Ignoring the warnings of the smell, I dug into the thing. The first bite was not that bad, until the burn spread. I tried wolfing it down in pieces, but after a few I had to take a small break. Or maybe a bit longer one. I tried to quench the burn with water, but obviously in vain. The clock kept ticking the seconds away; after all, there were only 30 minutes to finish it. Fortunately I had mates along who kept me aware of the time. And one who kept saying that water was a mistake.
After the break, which felt like infinity, and during which the burn only very slightly subsided, I decided to ignore all common sense and just go with it. After all, I did want the t-shirt. And so I did: I took reasonably sized bites and ate them as fast as I could, ignoring any indications from mouth and stomach that this was not human food. And I kept on going until there was not a bit left on the plate. And that's when the horror started. Fortunately the waitress was there quite quickly with a glass of milk, which was now allowed, as the beast had been eaten. But unfortunately it did not help much. So I went to the bar to order another one, which did not do much either. And then things took a turn for the worse, or at least that's what I thought, and my stomach refused to hold it in any longer.
The next day, however, I realized that it was not a bad thing, but indeed a good thing, and I did not have to cope with the aftermath. And I did get the t-shirt. Huge thanks to my wife for understanding and encouraging. And also to the friend-with-excuses for taking the pictures. The bad quality is not his fault; the restaurant was quite dimly lit. And he did try to consume the burger later on but, despite his valiant efforts, failed. Next try in the near future. =)
Now, the same place has this "Ultimate Death Burger", only served during special spicy food theme nights, and one for which you need to sign a waiver before they serve it...
While doing some Project Euler problems, I wanted to convert an unsigned long long (a 64-bit unsigned integer on a 32-bit machine) to mpz_t (and possibly vice versa). The API does not have a convenient method for this, but it is possible using mpz_import and mpz_export; it goes something like this:
mpz_t mp;
unsigned long long ull = 42;
mpz_init(mp);
/* mp = ull */
mpz_import(mp, 1, 1, sizeof(ull), 0, 0, &ull);
/* ull = mp; note: this needs boundary checking, and
 * mpz_sizeinbase() only accepts bases from 2 to 62 */
if (mpz_sizeinbase(mp, 2) <= sizeof(ull) * 8) {
    ull = 0; /* mpz_export() writes nothing when mp is zero */
    mpz_export(&ull, NULL, 1, sizeof(ull), 0, 0, mp);
}
For signed integers you'll need some additional tricks, as mpz_import and mpz_export do not support negatives; just negate before and after conversion (hint: mpz_neg()). The boundary checking in export case is slightly more problematic then, and is likely easier to do after export.
While creating a set of bash_completion rules for mc-tool, I read some other rule files to see how things are done. I happened to spot some useful functions in git's completion, mainly __git_ps1, which prints the name of the current branch.
Having the name of the current branch in the prompt can save you from some mistakes every now and then, and with the environment variable GIT_PS1_SHOWDIRTYSTATE you can even make it show whether there are unstaged and/or uncommitted changes in your tree.
This is how I used it:
By default the output of __git_ps1 is " (name-of-branch)", or "" if the current directory does not belong to a git tree. A format string can be given as an argument, like " (%s)".
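Something along these lines in .bashrc does the trick (the prompt string and the helper's path here are illustrative; on many distros __git_ps1 is already available once git's bash completion is loaded):

```shell
# Load git's prompt helper if __git_ps1 is not already defined;
# the path varies by distro, this one is Debian/Ubuntu's.
type __git_ps1 >/dev/null 2>&1 || . /usr/lib/git-core/git-sh-prompt

# Show unstaged (*) and staged (+) changes next to the branch name.
export GIT_PS1_SHOWDIRTYSTATE=1

# user@host:dir (branch)$
PS1='\u@\h:\w$(__git_ps1 " (%s)")\$ '
```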
The current version seems to have a small glitch that causes it to print (unknown) for the home directory if you use git global settings.
While mixing hobby and work development on the same machine, I've every now and then longed for a way to set environment variables depending on the current directory. Up to now I had been too lazy to do anything about it, but I finally did.
What the snippet does is find all .env files owned by the current user in the current directory and its parent directories, check whether the topmost one has changed and, if it has, read them all in reverse order. If the home directory is not a parent of the current directory, ~/.env is read before the others.
If you want this too, look at the code. Include these in your .bashrc, and you should be good to go.
Examples of environment variables I've found useful are DEB_EMAIL and DEB_FULLNAME used by dpkg tools and the equivalents in git world, GIT_COMMITTER_EMAIL, GIT_COMMITTER_NAME.
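A rough reimplementation of the idea looks like this (this is a sketch in my own words, not the actual snippet; it only tracks which .env file is nearest, the real thing may check more):

```shell
# Collect .env files owned by the current user from $PWD up to /,
# and re-source them (parents first) whenever the nearest one changes.
_env_load() {
    local dir="$PWD" f i
    local files=()
    while :; do
        f="$dir/.env"
        if [ -f "$f" ] && [ -O "$f" ]; then
            files+=("$f")
        fi
        [ "$dir" = "/" ] && break
        dir=$(dirname "$dir")
    done
    # If $HOME was not on the path to root, append ~/.env as a fallback.
    case "$PWD/" in
        "$HOME"/*) ;;
        *) [ -f "$HOME/.env" ] && files+=("$HOME/.env") ;;
    esac
    # Only re-read when the closest .env differs from last time.
    if [ "${files[0]:-none}" != "${_ENV_LAST:-}" ]; then
        _ENV_LAST="${files[0]:-none}"
        for (( i=${#files[@]} - 1; i >= 0; i-- )); do
            . "${files[$i]}"
        done
    fi
}
PROMPT_COMMAND="_env_load${PROMPT_COMMAND:+;$PROMPT_COMMAND}"
```

Reading the files in reverse order means a child directory's .env always wins over its parents'.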
My interest in the future of MeeGo is quite intense, both professionally and personally. Today the CEO of my employer wrote down his thoughts in his blog. It was quite an interesting read, and hopefully gives new hope to those in doubt.
Something I did not find documented anywhere: a sensible way to generate QtDBus adaptor and interface classes from introspection data with qdbusxml2cpp. I only found hacks using system() and custom targets to accomplish the feat.
I thought to myself that there must be a saner, better way, and while looking at the qmake-generated Makefile, I noticed some interesting includes. After some investigation, I found the "magic" variables DBUS_INTERFACES and DBUS_ADAPTORS. The introspection files should be named servicename.xml and listed in these variables, and qmake will do all the magic for you.
With these variables, it is really easy to generate the helper classes on-demand. Just remember to include the generated header in some code file, otherwise compilation will choke.
QT = core dbus
TARGET = qtdbus
DBUS_ADAPTORS = fi.inz.hello.xml
SOURCES = hello.cpp main.cpp
HEADERS = hello.h
With this simple .pro file qmake autogenerates the adaptor on build.
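For completeness, a minimal introspection file to go with the DBUS_ADAPTORS line above could look like this (the method and its arguments are made up for illustration; only the fi.inz.hello name comes from the .pro file):

```xml
<!DOCTYPE node PUBLIC "-//freedesktop//DTD D-BUS Object Introspection 1.0//EN"
 "http://www.freedesktop.org/standards/dbus/1.0/introspect.dtd">
<node>
  <interface name="fi.inz.hello">
    <method name="SayHello">
      <arg direction="in" name="name" type="s"/>
      <arg direction="out" name="greeting" type="s"/>
    </method>
  </interface>
</node>
```

From this, qdbusxml2cpp generates a HelloAdaptor class that forwards incoming D-Bus calls to the matching slots of the QObject it is attached to.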