nginx, cgit and git-http-backend

January 27, 2018 on 11:34 am | In geeky

Sounds simple, right? Plug cgit and git-http-backend into nginx to get a nice web interface and a working clone URL. Pushable too, of course. It turned out not to be quite that easy, but it is doable with some quirks.

There are plenty of instructions for parts of this lying around, but I didn't find one that catches 'em all, so some cuttin', pastin' and retryin' was needed. The end result nginx configuration:

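Roughly along these lines (a sketch rather than the exact config: the server name and /srv/git are stand-ins, while the binary and socket paths follow the Debian defaults mentioned below):

```nginx
server {
    listen 80;
    server_name git.example.com;        # stand-in name

    # cgit's static assets
    location /cgit-css/ {
        alias /usr/share/cgit/;
    }

    # smart-HTTP endpoints go to git-http-backend via fcgiwrap
    location ~ /.+/(info/refs|git-upload-pack|git-receive-pack)$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME /usr/lib/git-core/git-http-backend;
        fastcgi_param GIT_PROJECT_ROOT /srv/git;
        fastcgi_param GIT_HTTP_EXPORT_ALL "";
        fastcgi_param PATH_INFO $uri;
        fastcgi_pass unix:/run/fcgiwrap.socket;
    }

    # everything else is rendered by cgit
    location / {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME /usr/lib/cgit/cgit.cgi;
        fastcgi_param CGIT_CONFIG /srv/git/.cgitrc;
        fastcgi_param PATH_INFO $uri;
        fastcgi_param QUERY_STRING $args;
        fastcgi_pass unix:/run/fcgiwrap.socket;
    }
}
```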

cgit also requires configuration; it could be done system-wide with /etc/cgitrc, but I opted for defining the CGIT_CONFIG environment variable to point to a custom path. The .cgitrc ended up something like this:
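Something along these lines (a hedged sketch of a typical cgitrc, not the exact file; the clone-url host and repo path are stand-ins):

```
# sketch only; host and paths are stand-ins
virtual-root=/
css=/cgit-css/cgit.css
logo=/cgit-css/cgit.png
scan-path=/srv/git
clone-url=http://git.example.com/$CGIT_REPO_URL
```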



The binary and socket paths, and the cgit data path, are those used by the Debian default configuration, and may need adjustment for different installations. I tried to get rid of the virtual-root directive in cgitrc, but that would require setting SCRIPT_PATH, which fcgiwrap eats away.

For more access control, you could grab the repository name from the request path with an nginx named capture such as "^/(?&lt;repo&gt;[^/]+)/", and use the resulting $repo variable in auth_basic_user_file.
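A hypothetical sketch of that (I never wired it up myself; nginx does accept variables in auth_basic_user_file, and the htpasswd path is made up):

```nginx
# hypothetical per-repository auth; $repo comes from the named capture
location ~ "^/(?<repo>[^/]+)" {
    auth_basic "git";
    auth_basic_user_file /srv/git/$repo/.htpasswd;
    # ...the fastcgi handling goes here...
}
```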

node.js and wordpress sessions

February 2, 2017 on 11:27 am | In geeky

For a leetle project, I needed a way to validate a WordPress session from node.js. WordPress uses a somewhat complicated session system, with HMACs and part of the password salt, and I was unable to find a ready puzzle piece for the purpose. So I wrote my own.

The result is a javascript module. Sample usage:

var wps = require('wpsess');
var vdtor = new wps.Validator('/path/to/my/wp-config.php');
vdtor.validate('value_of_my_logged_in_cookie', function (err, username) {
    if (err)
        console.log('Authentication failed: ' + err);
    else
        console.log('Logged in user: ' + username);
});
It is a long way from perfect, but it works well enough for me.
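For the curious, the cookie value itself is just a handful of fields glued together with pipes. A rough sketch of the split, in Python for brevity (the four-field layout is an assumption based on recent WordPress versions; actually verifying the hmac part needs the salts from wp-config.php, which is the hard bit the module does):

```python
from urllib.parse import unquote

def split_logged_in_cookie(value):
    """Split a wordpress_logged_in cookie into its parts.
    Assumed layout: username|expiration|token|hmac."""
    parts = unquote(value).split("|")
    if len(parts) != 4:
        raise ValueError("unexpected cookie format")
    username, expiration, token, mac = parts
    return username, int(expiration), token, mac
```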

xterm.js + ssh2 + websocket-stream + browserify = webssh

January 2, 2017 on 11:24 am | In geeky

While pondering ideas for a website facelift, I thought it would be nice to have a terminal for easy ssh access. However, none of the readily available options really suited my fancy; wssh came close, but it was actually doing ssh on the server side and throwing the raw data inside websockets.

Fortunately npm is full of nice bits and pieces to build on, and browserify makes it quite easy to use most of them in a browser too.

Long story short, in the end it only required about a screenful of glue code to tie the bits together, and voilà. Place the code in webssh.js, and also grab the test html file.

$ npm install xterm ssh2 websocket-stream browserify
$ `npm bin`/browserify webssh.js > bundle.js
$ websockify 8022 localhost:22

(You'll need npm and websockify for the above.) Then launch the html file in your favorite browser, and log in. Tune the addresses to your liking.

There's also a demo version; just enter your websocket endpoint (for example ws://localhost:8022), username and password, and off you go.

CSS / SVG filters for fun and profit

September 8, 2016 on 6:02 pm | In geeky

Nowadays web technologies support nice and fancy things, such as CSS filters. The basic filters are pretty nice for many interactions, like hover effects etc. However, there is also support for SVG effects, which can be really complex and produce some really nice results. I wanted to share some little hacks I had fun with. Unfortunately browser support is not quite there yet; in my experience these work best in Firefox, but YMMV.

First up, a very simple "bloom" filter:


<feGaussianBlur in="SourceGraphic" stdDeviation="5" />
<feComposite in2="SourceGraphic" operator="arithmetic" k2="0.8" k3="1" />

This is very simple: just blur the image and add it up with the original. This makes lighter areas "leak" and look bright.

Next up, a slight variation of the previous, an unsharp mask:


<feGaussianBlur in="SourceGraphic" stdDeviation="5" />
<feComposite in2="SourceGraphic" operator="arithmetic" k2="-0.3" k3="1.3" />

Here, instead of adding the blurred version, we subtract it from the original, giving areas with higher contrast some depth.

Next up, an homage to simpler times:


<!-- Pixelize the image using a 2x2 pixel image and displacement map -->
<feImage width="2" height="2" xlink:href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAIAAAACCAIAAAD91JpzAAAAEklEQVQI12P4/5/hPwMDA4QAACfmA/2h2gQ5AAAAAElFTkSuQmCC" />
<feTile />
<feDisplacementMap in="SourceGraphic" xChannelSelector="G" yChannelSelector="R" scale="1" result="pxl" />

<!-- Map all color channel values <0.33 to 0, and the rest to 0.5 -->
<feColorMatrix type="matrix" in="pxl" values="1 0 0 0 -0.33 0 1 0 0 -0.33 0 0 1 0 -0.33 0 0 0 1 0" />
<feColorMatrix type="matrix" values="10000 0 0 0 0 0 10000 0 0 0 0 0 10000 0 0 0 0 0 1 0" />
<feColorMatrix type="matrix" values="0.5 0 0 0 0 0 0.5 0 0 0 0 0 0.5 0 0 0 0 0 1 0" result="halftones" />

<!-- Map all color channel values <0.66 to 0, and the rest to 1 -->
<feColorMatrix type="matrix" in="pxl" values="1 0 0 0 -0.66 0 1 0 0 -0.66 0 0 1 0 -0.66 0 0 0 1 0" />
<feColorMatrix type="matrix" values="10000 0 0 0 0 0 10000 0 0 0 0 0 10000 0 0 0 0 0 1 0" />

<!-- Add the two together -->
<feComposite in2="halftones" operator="arithmetic" k2="1" k3="1" />

This effect uses a displacement map filter to turn the image into 2x2 pixel squares for the retro feeling, and abuses color matrix result clamping to reduce the colors so that channel values can only be 0, 0.5 or 1, i.e. 27 different colors. This quantization can also be achieved using feComponentTransfer with lookup tables for each feFunc[RGB].
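That feComponentTransfer alternative would look something like this; a discrete transfer function with three table values lands on the same 0 / 0.5 / 1 levels:

```xml
<feComponentTransfer in="pxl">
  <feFuncR type="discrete" tableValues="0 0.5 1" />
  <feFuncG type="discrete" tableValues="0 0.5 1" />
  <feFuncB type="discrete" tableValues="0 0.5 1" />
</feComponentTransfer>
```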

And last, probably also the least, even simpler times:


Check the source if you want to see how it is made; it essentially uses the same tricks as the retro filter, but makes the tiles 8x16 and, instead of using the quantized color values, uses images of 8x16 console font characters to get an ASCII-art-ish result. This can be made pixel-perfect on Firefox, but to have it also work in Chrom(ium) at least somehow, some characters appear distorted.

Made conky eval useful

November 9, 2015 on 8:44 am | In geeky

Conky's eval seems rather useless; at least I couldn't get it to do anything I wanted, so I added a little patch to make it more useful (to me):

diff --git a/src/conky.c b/src/conky.c
index 5848b61..8702cea 100644
--- a/src/conky.c
+++ b/src/conky.c
@@ -1103,7 +1103,9 @@ void generate_text_internal(char *p, int p_max_size,
 #endif /* IMLIB2 */
                        OBJ(eval) {
-                               evaluate(obj->data.s, p, p_max_size);
+                               char buffer[max_user_text];
+                               evaluate(obj->data.s, buffer, sizeof(buffer));
+                               evaluate(buffer, p, p_max_size);
                        }
                        OBJ(exec) {
                                print_exec(obj, p, p_max_size);

Probably not the best thing ever, but it seems to do the trick for me; now I can get the address of the interface connected to the big bad internets with the snippet below. The first evaluate pass turns $${addr ${gw_iface}} into e.g. ${addr eth0}, and the second pass expands that. (Note: this won't work correctly when multiple interfaces have a default route.)

 ${eval $${addr ${gw_iface}}}

The patch applies against conky 1.9.0, unfortunately not against the heavily rewritten git master.

Update: a similar patch I made against the 1.10.x series has been merged into upstream conky; yay!

Simple SNMP proxying

August 20, 2015 on 7:34 am | In geeky

I recently needed to change my modem due to a technology change, and the new modem did not like to talk SNMP to the big bad Internet. However, it does so quite happily on the local network. While this is probably not an issue for many, I do not have any decently powered server at home, only a rented one elsewhere that collects my network statistics.

To overcome the issue, I needed to get my OpenWRT router to bounce the SNMP packets to the modem, and relay the replies back. And while this is seemingly possible with net-snmp snmpd, the configuration is far from straightforward and the OpenWRT packages seem to be lacking in that regard.

Hence I wrote my very own, very simple proxy daemon. It is not smart at all; it simply sends SNMP replies back to whoever made the last request. But it works for my purposes. If you are in a similar need, you can grab the source.
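The core idea fits in a few lines. Here is a sketch in Python rather than the language of the actual daemon, doing the same dumb relaying (addresses and the stop flag are illustrative):

```python
import socket

def snmp_proxy(listen_addr, target_addr, stop):
    """Dumb UDP relay: forward datagrams to target_addr, and bounce
    anything coming back from target_addr to the most recent requester."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(listen_addr)
    sock.settimeout(0.2)
    last_client = None
    while not stop.is_set():
        try:
            data, addr = sock.recvfrom(65535)
        except socket.timeout:
            continue
        if addr == target_addr:
            if last_client is not None:
                sock.sendto(data, last_client)  # reply to the last requester
        else:
            last_client = addr
            sock.sendto(data, target_addr)      # forward the request onward
    sock.close()
```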

I also made a slightly more complex version that allows redirection of different community names to different SNMP servers/agents (no community name rewriting).

Keyboard URL launching in gnome-terminal

December 11, 2014 on 8:06 am | In geeky

While I have been trying to make my desktop experience lighter, I somehow keep coming back to gnome-terminal; I guess it's what I'm used to, and I know how to configure it with anti-aliased fonts etc. However, the fact that it is missing the ability to launch URIs from the keyboard was hindering my dwm workflow, so I decided to do something about it.

First I tried a different terminal, namely rxvt-unicode, which has the functionality, but when I couldn't configure it to my liking in a reasonable time (freetype font spacing, mostly), I decided to see if I could hack gnome-terminal to do what I wanted.

At first I used a quite naïve method, just going through the characters one by one (from end to start) and calling the VTE check method for each; and while it worked reasonably well, it turned out to be terribly slow in some scenarios, with a single scan taking more than a second on a reasonably powerful laptop.

Since gnome-terminal also stores the regexes internally, it seemed like a good idea to try grabbing the whole contents of the terminal and running the regexes on that, to see if the results came faster. And indeed they did, but it yielded a slight bug, with some URIs being reported twice if they overlap. Lazy as I am, I chose to ignore this and just go with it.

After finding the URIs, it was just a matter of drawing them somehow and handling the keypresses to launch them. I chose a very simple launching mechanism, resembling the hints mode of vimperator/pentadactyl. Each hint is allocated a character from a set (after the set runs out, the remaining URIs are ignored). Then I drew these hints over the terminal like they were tooltips, and voilà.
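The allocation scheme really is that simple; a hypothetical sketch (the real thing is C inside the gnome-terminal patch, and the charset here is made up):

```python
def assign_hints(uris, charset="asdfghjkl"):
    """Pair each found URI with one hint character from the set;
    once the set runs out, the remaining URIs simply get no hint."""
    return list(zip(charset, uris))
```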


The resulting patch is 342 lines (226 lines added) and can be found here. It should apply cleanly on top of gnome-terminal 3.6.2-0ubuntu1 (Ubuntu 14.10), and with fuzz on vanilla 3.6.2. Applying it on top of master takes some handyman work.

Wiki with git backend

October 3, 2014 on 11:16 am | In geeky

During a time void, also known as compile time among programmers, I started to play with an idea that had surfaced earlier in lunchtime discussions with colleagues. The more I thought about it, the worse the idea seemed, and the more I wanted to see how much work it would be. The original idea was that git would make a great knowledge base, giving nice diffs and logs and so on, but the less technologically savvy were a bit reluctant, as magic black-boxes-with-gray-text are not what they consider user friendly. Hence a better interface was needed. And of course plugging a server-side wiki engine onto a git backend was too simple a solution. No no no. Of course it had to be done in the browser. And hence giki was born.

So, what exactly is giki? It is a slightly modified js wiki formatting engine, jquery-wikitext, glued onto a self-made git repo parser utilizing inflate.js from zip.js. The result is a very simple read-only wiki, using a git repo over HTTP and doing all the magic in the browser.
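The repo parser part boils down to inflating loose git objects and splitting off their header. A Python sketch of the same trick (the browser version does this with inflate.js; helper names here are made up):

```python
import zlib

def make_loose_object(kind, body):
    """Deflate content the way git stores loose objects:
    zlib("<type> <size>\\0<content>")."""
    header = kind.encode() + b" " + str(len(body)).encode() + b"\x00"
    return zlib.compress(header + body)

def parse_loose_object(raw):
    """Inflate a loose object and split the header from the content."""
    data = zlib.decompress(raw)
    header, _, body = data.partition(b"\x00")
    kind, size = header.split(b" ")
    if int(size) != len(body):
        raise ValueError("size mismatch")
    return kind.decode(), body
```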

Does it work? Sure it does. Is it useful? Not in the least. Why then? Because I could. If you're still interested, see it in action (use view source to get to the source). I also used the same git blob parser to implement a simple repo browser.

Eating another hamburger

July 1, 2014 on 2:41 pm | In geeky

Remember this hamburger I ate? No? It's alright, why would you. Anyway, the previous one has a big brother. One that is quite angry about what was done to the little one.

Well, foolish as I am, I decided to accept the challenge, and went head to head with the behemoth. Not only is the Ultimate Death Burger more laden with hot sauces, the rules are also more strict.

The first rule: only 20 minutes to complete the meal. The burger is accompanied by a set of fiery french fries. This means there is not a lot of room for breathers. Fortunately for me, I'm usually quite a fast eater. I went through the fries in a bit shy of 5 minutes, and continued onto the burger itself. The burger was quite smooth sailing until about halfway through, when the second rule started to play a role.

The second rule: only one pint of water allowed. After the halfway mark, my mouth felt like it was in flames, and I needed to quench the fire in between bites. Well, that was at first. But one bite later, my mouth would not agree to swallow the napalm-like substance, and I needed to fool it with the water. Because the water supply was so scarce, it was a delicate balance to have just the right amount. Nevertheless, to my great surprise, I managed to slay the beast, which is when the third, ultimate rule showed its head.

The third rule: stay seated for 10 minutes. No extra water, no milk, and definitely no vomiting. The seconds on the timer were not running, or even walking; they made snails seem hasty. Before the time was up, my fingers were numb and I was generally disoriented. I had a pint of water ready, and after what seemed like an eternity I got to try to quench the fire within. Little use it was; as soon as the water mixed with the hellish contents of my stomach, it overthrew any opposition still holding onto the nutrition, and I quickly had to excuse myself.

So all in all the feat was successful. Painful, moot and stupid, but nevertheless a success. Unfortunately I didn't even get a t-shirt. Only my name on a sheet of paper.

Eating a hamburger

September 26, 2011 on 11:46 am | In personal

Eating a single simple hamburger sounds easy, right? Well, at least in theory. But sometimes it is not quite that easy. The nightmares are slowly fading away, and I can share my experiences with one devil spawn of a burger.

The day started as a pretty good day. A friend was returning from a vacation, and I made a trip with my wife to welcome him back. I called him on the way and challenged him to eat a hamburger. Unfortunately he had too many excuses, but he agreed to eat something else while I feasted on the burger from the depths of hell.

So we went to the restaurant, and I hesitantly placed an order for one "Death Burger" and lots of water. Not that the water would help anything, but you are not allowed to consume dairy products if you wish to get the t-shirt. And after all, the t-shirt was what I was after.

Getting the burger revealed its horror. The smell of hot sauce was overwhelming. I had heard from multiple sources that it was a good idea to start with the fries, so that's exactly what I did. While spicy, they were reasonably easy to eat, except that they made my tongue feel like it was being burnt. But after all, they were the easy part, and especially after deciding that it wasn't actually "temperature-burn" but only "spice-burn", I finished them quite quickly.

Trying to fight fire with water

And then there was the burger, still smelling of hot sauce. And I don't mean tabasco hot, or Texas Pete hot. I mean evil hot. Ignoring the warnings of the smell, I dug into the thing. The first bite was not that bad, until the burn spread. I tried hogging in pieces, but after a few had to take a small break. Or maybe a bit longer. I tried to quench the burn with water, but obviously in vain. The clock kept ticking the seconds away; after all, there were only 30 minutes to finish it. Fortunately I had mates along who kept me aware of the time. And one who kept saying that the water was a mistake.

Empty plate

After the break, which felt like infinity and during which the burn only very slightly subsided, I decided to ignore all common sense and just go with it. After all, I did want the t-shirt. And so I did: I took reasonably sized bites and ate them as fast as I could, ignoring any indications from mouth and stomach that this was not human food. And I kept on going until there was not a bit left on the plate. And that's when the horror started. Fortunately the waitress was there quite quickly with a glass of milk, which was now allowed, as the beast had been eaten. But unfortunately it did not help much. So I went to the bar to order another one, which did not do much either. And then things took a turn for the worse, or at least that's what I thought, and my stomach refused to hold it in any longer.

The next day, however, I realized that it was not a bad thing but indeed a good thing, as I did not have to cope with the aftermath. And I did get the t-shirt. Huge thanks to my wife for understanding and encouraging, and also to the friend-with-excuses for taking the pictures. The bad quality is not his fault; the restaurant was quite dimly lit. He did try to consume the burger later on, but despite his valiant efforts, failed. Next try in the near future. =)

Now, the same place has this "Ultimate Death Burger", only served during special spicy food theme nights, and one for which you need to sign a waiver before they serve it...

