Saturday, April 11, 2015

Blekko has been acquired by IBM

Computer industry giant IBM has apparently acquired the search engine startup blekko and shut down its website, stating only that blekko's technology will be integrated into IBM Watson.
This acquisition is eerily similar to Apple's deal with FoundationDB, a NoSQL database company whose downloads were all removed following the acquisition by Apple.

Blekko, which generated revenue through the sale of custom SEO statistics, used to donate vast URL collections to the non-profit Common Crawl project, which aims to develop open source search engine components and methods for efficient information retrieval.

As of now, there is no mention of the blekko acquisition on the Common Crawl website; however, one might rightfully wonder whether the URL donations will continue.

Another search engine, Cuil, which shut down in 2010, had donated crawl collections to the Internet Archive; however, the collections do not seem to be accessible as of this writing.
Google announced plans to shut down Freebase, a knowledge base project it acquired in 2010, within six months after December 2014. There is, however, an open successor project named BaseKB.

All these incidents should remind us that we currently do not have a sufficiently large, open search infrastructure, and that work needs to be done to move in that direction, especially as big companies lay their hands on open information and lock it up in the process.

Wednesday, March 18, 2015

Different dollar cost averaging strategies for investors

When comparing different investment strategies, the lump sum investment almost always emerges as the option that outperforms all others over time.
However, making a single large investment into a stock or fund is beyond the financial possibility of most of us.
Instead, we are forced to build our investment portfolio in increments.

This is where a strategy called dollar cost averaging comes in.
Dollar cost averaging (also known as unit cost averaging, irrespective of the underlying currency) deploys a fixed amount of money per month (or any other fixed period of time) towards a particular stock.
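
A quick illustration of why this works (the prices are made up for the example): a fixed monthly amount buys more shares when the price is low, so the average cost per share ends up at the harmonic mean of the prices, which is never higher than their plain average:

```python
# Dollar cost averaging: invest a fixed amount at each (hypothetical) price.
prices = [10.0, 8.0, 12.0, 10.0]   # made-up monthly share prices
monthly = 100.0                    # fixed amount invested per month

shares = sum(monthly / p for p in prices)        # more shares at lower prices
avg_cost = (monthly * len(prices)) / shares      # average cost per share paid
avg_price = sum(prices) / len(prices)            # plain average of the prices

print(round(avg_cost, 2), round(avg_price, 2))   # → 9.8 10.0
```

The average cost ($9.80) is below the average price ($10.00) precisely because the low month bought proportionally more shares.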

In our simplified stock market, there are two asset classes: stocks that pay dividends and stocks that do not.
This article will look into dividend-paying stocks, because we want to earn income every month.
For stocks that do not pay dividends, the strategy is quite simple to explain:
once the shares in your portfolio have reached an all-time high, sell them (or some of them); when the stock price has decreased enough, buy additional shares at a price below the selling price of your last transaction. Ideally, you should now be able to buy more shares at the lower price than the number of shares you sold when the price was at its maximum.
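
The rule above can be sketched as a tiny decision function (the 10% buy-back discount and all prices are invented for illustration):

```python
# Sell-high / buy-low sketch for a non-dividend stock.
def decide(price, all_time_high, last_sale_price, discount=0.10):
    """Return 'sell' at a new all-time high, 'buy' once the price has
    dropped at least `discount` below the last sale price, else 'hold'."""
    if price >= all_time_high:
        return "sell"
    if last_sale_price is not None and price <= last_sale_price * (1 - discount):
        return "buy"
    return "hold"

print(decide(52.0, all_time_high=50.0, last_sale_price=None))   # sell: new high
print(decide(46.0, all_time_high=52.0, last_sale_price=52.0))   # buy: 10%+ below last sale
print(decide(49.0, all_time_high=52.0, last_sale_price=52.0))   # hold: not cheap enough yet
```

Choosing the discount threshold is the hard part in practice; too small and transaction costs dominate, too large and the buy-back may never trigger.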

If the stock we own pays a nice dividend, whether at quarterly intervals or, even better, every month, the case is a bit different.

Suppose that when we buy the initial position in the stock, its dividend yield is a sustainable 3.0 % per year. Now, even though we use monthly dollar-cost-averaging to buy shares when the price is low, over time the share price will go up.
This price appreciation has two primary effects:
- the dividend yield will decrease and
- the money gained through the price increase above our initial cost is locked up in the stock and does not earn interest.

We are now faced with a dilemma: on the one hand, we want to earn a nice stream of dividends; on the other hand, we want to make use of the money gained through price appreciation.

There are several strategies that can be used to implement dollar cost averaging:

1) The conventional strategy is to consistently buy the maximum number of shares available for a fixed amount of money each month. This results in an increasing amount of dividends paid to us every month, assuming the company maintains or increases its dividend rather than lowering or, worse, cutting it.

This scenario does not allow you to take advantage of price appreciation.

2) Buying additional shares every time you have enough cash to add at least one share (transaction costs included) raises your average cost to the point that, if you were forced to sell, you would probably incur losses due to the gap between the market price and the risen average cost of your shares. This is not what we want; moreover, transaction costs will eat you alive (at Charles Schwab, a minimum of $8.95 per trade; at E*Trade, $9.99 per trade).

3) A synthetic strategy that takes advantage of DCA and at the same time minimizes the number of transactions (and therefore transaction costs) is the following: only buy new shares if the additional shares increase your dividends by an amount equal to or greater than the transaction cost. This way, the new shares immediately pay for themselves.
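
A minimal sketch of this rule, using invented numbers (a $30 share, a $0.90 annual dividend, and the $8.95 Schwab commission mentioned above): accumulate cash and only place the order once the affordable shares' dividends would cover the commission:

```python
# Strategy 3: only buy when the new shares' annual dividends
# at least cover the transaction cost, so they pay for themselves.
def should_buy(cash, price, dividend_per_share, commission):
    """Return the number of shares to buy, or 0 if the trade
    would not immediately pay for itself through dividends."""
    affordable = int((cash - commission) // price)     # shares we can afford
    if affordable <= 0:
        return 0
    added_dividends = affordable * dividend_per_share  # extra income per year
    return affordable if added_dividends >= commission else 0

print(should_buy(cash=150.0, price=30.0, dividend_per_share=0.90,
                 commission=8.95))   # 0: 4 shares earn only $3.60
print(should_buy(cash=350.0, price=30.0, dividend_per_share=0.90,
                 commission=8.95))   # 11: $9.90 in dividends covers the fee
```

With these numbers, the rule effectively batches roughly three months of savings into one commission-efficient purchase.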

4) The last strategy implements what we would use for stocks that do not pay dividends: sell high, buy low.
In the case of dividend-paying stocks, when the price has moved above a threshold, sell a number of shares whose capital gains equal the log utility of the sum of dividends you have received so far from all shares owned immediately before the sale.
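
One way to read this rule in code (a sketch with invented numbers; the choice of the natural logarithm as the utility function follows the log utility mentioned above):

```python
import math

# Strategy 4: sell just enough shares that the realized capital gain
# roughly equals the log utility ln(D) of total dividends D received.
def shares_to_sell(price, avg_cost, dividends_received):
    """Return how many shares to sell, or 0 if there is no gain to
    realize or too few dividends for a positive log utility."""
    if price <= avg_cost or dividends_received <= 1.0:
        return 0
    target_gain = math.log(dividends_received)   # log utility of dividends
    gain_per_share = price - avg_cost
    return math.ceil(target_gain / gain_per_share)

# Hypothetical: bought at a $30 average, price now $36, $250 in dividends.
print(shares_to_sell(36.0, 30.0, 250.0))   # 1: ln(250) ≈ 5.52 < $6 gain/share
```

Because the logarithm grows so slowly, this rule sells only a token amount of the position and keeps most of the dividend stream intact.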

The difference in the logarithmic volatility over various time periods (depending on your horizon) could serve as a signal to initiate a sell or buy action.

Future articles will look into the mathematical foundations of the above-mentioned utility functions, risk aversion and behavioural finance.

A good analysis of DCA has been published by The Vanguard Group.

Sunday, January 25, 2015

Pakistan power blackout

After terrorists destroyed a transmission tower in Baluchistan, Pakistan, nearly 80% of Pakistan's population, or more than 140 million people, was left without power for hours (see the Bloomberg article).

Just like India, which I mentioned in a post related to a power outage in 2012, Pakistan would hugely benefit from a distributed, resilient grid.

Sunday, August 17, 2014

FPGA bitstream documentation

This article contains a growing collection of resources on FPGA bitstream formats and toolchain details as I find out more about the various vendors' implementations.

Over the past couple of years, many projects have developed open source alternatives for proprietary solutions in the IT industry. One of the last areas of proprietary domination includes reconfigurable computing chips. This technology is becoming more and more important as the advantages of FPGAs over conventional processors in speed and energy efficiency become evident.
Even Microsoft is now testing FPGAs to accelerate their Bing search engine.

While there is a wide range of tools available to program the many flavors of microcontrollers like AVR and PIC, or boards like the Parallella (which happens to be a completely open design), and to flash BIOS chips, similar capabilities are missing in the open source world for field-programmable gate arrays, or FPGAs.

Tools to develop and compile the necessary VHDL or Verilog code to be run on FPGAs are already available, such as Icarus or GHDL.
In order to use the compiled code with an actual FPGA, each vendor has their own tools:
Xilinx offers the free ISE WebPack; Altera seems to have an at least partly open source tool called STAPL (Standard Test and Programming Language); yet to flash the resulting binary code, further proprietary tools are needed.
Lattice Semiconductor, which offers a board to use with the Raspberry Pi computer, is doing the community a huge favor by providing access to affordable hardware. An early attempt at bitstream analysis was made, but that site is no longer online. The urjtag project supports various bitstream formats as well as JTAG adapters and is under active development; STAPL, however, is not yet supported by urjtag. For now, binaries and source code for a STAPL compiler and player are available from Altera. Information on building a bitstream for Microsemi FPGAs is also available.

Routing of components on the FPGA chip is a complex task that is performed within the proprietary toolchains.
There are currently two open source projects that aim to implement place-and-route routines for FPGAs: RapidSmith and VTR (Verilog-to-Routing).

Rather comprehensive information on older Xilinx FPGAs can be found on the Internet Archive, though no longer on the manufacturer's site.

Thursday, August 14, 2014

A year with the VAUDE Luke messenger bag

About a year ago, I was looking for a suitable replacement for the assortment of backpacks I had used until then to transport my laptop, folders, documents and other utensils I needed to have with me every day.
While browsing for an ergonomic solution that would allow me to carry my stuff when riding my bike in the city, I became aware of the disadvantage of the backpacks I had been using for much of my high school years: heavy rain would soak them to the point where the contents would sometimes get wet. This was something I wanted to avoid this time.
Soon I remembered that a friend had bought a messenger bag for uni, so I dug up some more information on them.
Of the various messenger bags I checked out, the VAUDE Luke L clearly stood out. It is made from very durable, water-proof tarpaulin, and comes with an adjustable shoulder strap and a removable pad.
The bag delivers a remarkable volume of 19 liters, which is more than enough for my daily needs.
One thing that makes the VAUDE Luke superior to other messenger bags is the belt that stops the bag from moving around too much when you are riding a bike.
A padded compartment allows you to transport a laptop without having to worry about damaging it. On campus, I usually keep a laptop, a large, heavy folder, books and a Nalgene bottle all in the messenger bag.
Another unique feature is the handle that allows you to carry the Luke L instead of using the shoulder strap.
The bag can also be used for travel, can be attached to the handle of a trolley and, at 37 x 48 x 14 centimeters, is compact enough to be used as cabin baggage.
On occasion, I have used the VAUDE Luke L on a city trip, packing some clothes and little more than a toothbrush along with my laptop.

So far, I could not be happier with my choice of the messenger bag over a conventional backpack.

Monday, August 4, 2014

Document processing with Ghostscript

Quite often, printing a document in full color mode is not only unnecessary, but also a waste of resources.
In order to save ink, time and money, you can simply use the draft mode available in your printer menu. The downside of this method is that you cannot preview your document in draft mode: the preview will always show a full-color version, regardless of printer settings. On Linux systems, you at least have the option of printing the document to a PDF file; however, that file will again be in full color. This makes it necessary to process the PDF file with additional tools to get a proper grayscale version of your document.

In this post, you will learn about the tools to help you customize your document so it uses minimal ink, yet is clearly readable and can be checked before you waste paper and ink on a botched print.
To begin with, we will use Ghostscript. Binaries for Linux, Windows and Mac can be downloaded from the Ghostscript website.

Using any shell on Linux, run the following command to process your file (let's call it input.pdf for now) and turn it into a new file named result.pdf.

 gs -sOutputFile=result.pdf -sDEVICE=pdfwrite -sColorConversionStrategy=Gray \
    -dProcessColorModel=/DeviceGray -dNOPAUSE -dBATCH input.pdf

This series of Ghostscript options takes your PDF file, converts it to grayscale using a predefined strategy and generates a new PDF file that can be printed as-is. Note that not all command line options I am using here are fully documented by Ghostscript. This was initially what made me write this article: to help others learn about advanced options that are available but hard to find in the Ghostscript tool.

When generating a PDF from a Word document created with Microsoft Office 2010 or later, please be aware that some features can cause Ghostscript to fail to convert some or even all sections of the document.

Tuesday, April 2, 2013

Need to make a tough decision? Don't think in your native language!

This is something that many of you probably had noticed subconsciously before, but only now has a new study published in Psychological Science shed more light on it. (Sorry for not linking directly to it; the paper is not openly available.)
"Thinking in a language other than your native language improves decisions" is how Wired sums up the ongoing research. In part, this is because thinking in a foreign language breaks up the habitual thought processes that have manifested themselves through cultural influences over the years.

The method of using another language for thought can also be applied when writing, even if you are just jotting down a list of groceries you need to shop for. Using a foreign language requires you to think more thoroughly about whatever you are going to do, which results in better decisions, according to the researchers' experiments.
Even more profound effects are evident when it comes to personality: immersing oneself in another language has the power to change certain personality traits while using the foreign language.