
Archive for the ‘linux’ Category

Ubuntu lucid lynx is extremely slick

Sunday, April 25th, 2010

I tried Ubuntu Lucid Lynx (10.04) on my laptop, and my first impression is that it's way better than 9.10. I currently run 9.10 on my desktop, and tried it on my laptop as well. When I tried to install 9.10 on my new laptop (an HP dv4 series), the screen didn't show anything, and on my desktop, compiz does not work under 9.10. When I tried 10.04, the screen worked and so did all the hardware on my laptop. I still have to try it on the desktop machine.

Beyond the screen working, the new GNOME theme is very slick, and together with compiz it is way slicker than Win7, which runs on the same laptop. I installed all my work tools on the Ubuntu installation, including gcc, emacs, vim, etc.

The installation was very smooth. I looked around on Wubi's website for a way to install 10.04 but didn't find one. After some googling I found a link on Librarian Geek's website.

Some drawbacks: 1. happycoders emacs is no longer maintained, and Ubuntu removed it from the release. 2. 64-bit Flash does not install correctly from Adobe's website. I got it to work by downloading the tar.gz file pointed to by Raman here. The one from Adobe for some reason does not work, and neither does the channel.

Given its alpha status, I think it will be an excellent LTS release for Ubuntu. For me, it's a keeper on my laptop.

using ubuntu as my main desktop machine

Wednesday, February 3rd, 2010

When I was studying for my PhD at the University of Miami, I had Fedora installed as the primary OS on my machine and switched to Windows 2000 only when I was forced to. That was around 2002/2003, and I was able to get all my work done there, including posting lab notes for students, writing my papers and research reports in teTeX, and of course writing code. I don't recall that Octave was available there, so I had to use Windows to run simple experiments in MATLAB.

Since I didn't have a desktop machine, I decided to set one up. We had an old Dell Optiplex GX260 with an XP license but no media; it had been bought as part of a lot of machines, so there was no hope of obtaining the original media, making the license effectively useless. The OS on the machine was damaged beyond repair.

I decided to install Ubuntu desktop and tried both 8.04 and 9.10 (sorry, I have a hard time keeping track of the animal names). Amazingly, 8.04 worked flawlessly, but on 9.10 compiz had an issue with the on-board graphics card. I installed Google Chrome and Flash Player (which are not part of the Ubuntu repository), and I'm currently using Gmail chat instead of GTalk. Hopefully I will set up Pidgin for voice chat.

I always knew that I don't need MS Windows: my Windows machine runs openoffice.org and PuTTY 90% of the time! Hopefully I will find a good graphics card compatible with 9.10 on Newegg and upgrade soon.

Linux average load

Saturday, January 31st, 2009

Yesterday, for the first time, I took some time to understand the “load average” that appears in the top shell command. It turns out to be a very informative number, if used right.

Linux averages the number of processes that are running or runnable, including those blocked waiting for I/O. The average is calculated over three windows: 1 minute, 5 minutes, and 15 minutes.

If no processes are blocked on I/O, this number should always stay below the number of available CPU cores. For instance, a load average of 2.6 means that, on average, 2.6 processes were running or wanted to run. If you have only two cores, then on average 0.6 processes could not find an available CPU to run on, which means you need to upgrade your machine (or server).
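The rule of thumb above is easy to script. This is a minimal sketch, assuming a Linux box with /proc/loadavg and the coreutils nproc command:

```shell
#!/bin/sh
# Read the three load averages (1, 5, and 15 minutes) from /proc/loadavg.
# The remaining fields (runnable/total processes, last PID) go into $rest.
read load1 load5 load15 rest < /proc/loadavg
cores=$(nproc)
echo "1-min load: $load1, cores: $cores"
# awk handles the floating-point comparison; exit 0 means load > cores
if awk -v l="$load1" -v c="$cores" 'BEGIN { exit !(l > c) }'; then
  echo "load exceeds core count: processes are waiting for CPU (or I/O)"
fi
```

The same comparison against $load15 tells you whether the pressure is sustained or just a momentary spike.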

Since this number grows when processes are blocked on I/O, it is worth checking whether any processes are blocked on I/O before spending your last penny on new hardware. The solution is a tool called atop. If the disks are stressed and cannot keep up with requests (so that processes end up waiting for the disk), the DSK line (or lines, if you have more than one disk) will show in red.

At that point you should find the culprit and do something to run that process in a better way.

Discovering some of Linux Cow Powers: awk and bash

Tuesday, July 22nd, 2008

If you've never used Linux, you should give it a try. The simplest way is to install coLinux with Xming. If that feels too involved, just burn an Ubuntu 8.04 bootable CD: it runs straight from the CD with almost all everyday features, without installation, formatting, or any other extra work.

One of the cow powers I use Linux for, and couldn't find a way to do in the Windows shell, is renaming a large number of files. I had around 3000 bmp files whose names were in the following format:

<word>-<volume>-<number>.bmp

and I wanted to rename them to

<word>-<number>.bmp

I couldn't find a way to do this on Windows except by writing a small program. On Linux the solution is a program called awk, which can tokenize input strings and write parts of them out. awk is a sophisticated tool, and I usually use only a very small subset of its features.

The solution was finally something similar to this:

find . -name '*.bmp' | awk -F- '{print "mv "$0" "$1"-"$3;}' | bash -v

The first part runs the find program to list all files with the .bmp extension. The results are then piped as input to awk instead of being written to the screen. awk is told that the field separator is ‘-’ via the -F parameter. The one-statement awk program prints the word mv (to issue a rename command), then $0 for the full string before tokenizing, then $1 for the first tokenized part and $3 for the third, skipping the second. So awk is being used to write the commands that we need to execute.

Those commands are then piped to bash, the Bourne-again shell, with the verbose flag on so that we can see the commands being executed.

A good practice when trying something like this is to pipe the results to less first, instead of bash. less is a program like more that lets you inspect whatever was piped to it and flip forward and backward. It also has some nice features, like -N to show line numbers and searching for a word with the forward slash /, with commands very similar to vim's.

So the less version of the line would be this:

find . -name '*.bmp' | awk -F- '{print "mv "$0" "$1"-"$3;}' | less

This is a very handy technique for renaming lots of files, or for carrying out batch operations on multiple files, and I use it often.
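To see the tokenization in isolation, run the awk program on a single hypothetical filename (the name is just an illustration of the &lt;word&gt;-&lt;volume&gt;-&lt;number&gt;.bmp pattern):

```shell
# -F- splits on '-', so $1="cat", $2="vol2", $3="0017.bmp",
# while $0 keeps the whole untokenized name
echo "cat-vol2-0017.bmp" | awk -F- '{print "mv "$0" "$1"-"$3;}'
# prints: mv cat-vol2-0017.bmp cat-0017.bmp
```

Each input line produces one mv command, which is exactly what gets handed to bash in the full pipeline.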


GNUPlot wordpress plugin v1.1

Friday, July 18th, 2008

This plugin renders GNUPlot charts without needing GNUPlot on your server. It communicates with our custom version of GNUPlot hosted at clker.com and responds with a PNG chart, or with error messages in case of errors.

Write your GNUPlot code between [ gplot] and [ /gplot] (without the spaces). The maximum chart size is 1×1.

To install

  1. Copy the plugin file ( gnuplot plugin ) into your wp-content/plugins directory, and rename it to .php instead of .phptxt.
  2. Create a wp-content/cache directory and make sure it is writable by the webserver.
  3. Activate the plugin from the plugins tab inside WordPress.

Example:

[ gplot]

set size 1,0.7
set dummy u,v
unset key
set parametric
set view 60, 30, 1.1, 1.33
set isosamples 50, 20
set title "Interlocking Tori - PM3D surface with depth sorting"
set urange [ -3.14159 : 3.14159 ] noreverse nowriteback
set vrange [ -3.14159 : 3.14159 ] noreverse nowriteback
set pm3d depthorder
splot cos(u)+.5*cos(u)*cos(v),sin(u)+.5*sin(u)*cos(v),.5*sin(v) with pm3d,\
1+cos(u)+.5*cos(u)*cos(v),.5*sin(v),sin(u)+.5*sin(u)*cos(v) with pm3d

[/ gplot]

would produce a PNG of the “Interlocking Tori” PM3D surface. Enjoy!

ionice and daily backups

Wednesday, June 25th, 2008

Anyone running a web server will either use a RAID for data protection, or will run a cron job that backs up all the data daily.

If you are running a budget server like mine, most likely you'll be doing daily backups. At 4 am every day a backup process runs and stores a dump of the SQL databases and a gzip of all the web folders, including the SVN repository, on a separate hard drive, which gets unmounted after the backup is done. The process takes around 2 hours to complete, during which the hard drives are all stressed; if the backup scripts are not carefully written, they will starve any Apache request due to slow disk access.

Because the primary reason of a webserver is to serve pages, I do two things to guarantee that any apache or database process will take precedence in resources when compared to the backup process:
1. The backup process runs with nice -n19 <backup>. This means the process runs at an idle CPU priority and will only use the CPU if no other process needs it. nice alone guarantees only that the CPU stays free; programs like tar and gzip are heavy disk users, and even a tiny amount of CPU usage is enough to generate big disk reads/writes. Large disk reads/writes block other processes waiting for the disk resource, and thus slow down Apache, Postgres, and MySQL, and ultimately every page accessed during the backup. In simple words, if the backup process is only niced and you try to browse any of my websites during the backup, the pages will load very slowly.
2. This leads us to the second precaution: ionice -c2 -n7 nice -n19 <backup>. ionice is a tool that not many people know exists. It prioritizes disk access for different processes: long tasks that hit the disk hard and are not important or time critical should get a lower priority than those requiring a fast turnaround. ionice has three scheduling classes, given by the -c parameter. Class 1 is the real-time class, and only root can run processes in it. Class 2 is best-effort, with levels 0-7 given by the -n parameter, 0 being the highest priority and 7 the lowest. Class 3 is idle.

ionicing and nicing the backup process makes a huge difference in the speed and responsiveness of the webserver during the daily backup, especially when the backup runs long. Try going over its man page. A good post about ionice is here.
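As a sketch, the nightly cron job described above would look something like this (the paths and the mysqldump/tar targets are placeholders, not my actual setup):

```shell
#!/bin/sh
# Hypothetical daily backup script: CPU-idle priority via nice -n19,
# lowest best-effort I/O priority via ionice -c2 -n7.
BACKUP=/mnt/backup/$(date +%F)   # placeholder mount point
mkdir -p "$BACKUP"
# Dump the databases and archive the web folders at low priority,
# so Apache and the database engines keep precedence on CPU and disk
ionice -c2 -n7 nice -n19 mysqldump --all-databases > "$BACKUP/sql.dump"
ionice -c2 -n7 nice -n19 tar czf "$BACKUP/web.tar.gz" /var/www
```

Scheduled from crontab at 4 am, with the backup drive mounted before and unmounted after.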


Gartner: “Windows is collapsing”

Friday, April 11th, 2008

So Gartner, one of the largest analyst firms, said at a conference that Windows is collapsing. Many reasons were given, including a Windows codebase that had become unmaintainable and led Microsoft to start from the more stable Windows 2003 release and build on it for Vista. This in turn led to Vista not delivering meaningful enhancements from the end consumer's point of view. There is also the fact that applications are moving to the web and becoming OS agnostic, so much software now runs on almost all operating systems (OpenOffice, for example, Inkscape, and others based on GNU's compilers or Java).

Other reasons included a hardware-demanding OS that can't fit on small PDAs, giving Linux and OS X their chance. I would add to this list the complete prevalence of Linux in embedded OSs in appliances like TiVos, with zero competition from Microsoft.

A major contributor, from my point of view, is the reason for making a new release. When a company decides to allocate resources to developing software, it needs goals. The only obvious goal in the release of Vista was to force people into DRM, using Windows' position as a must-have OS on every PC or laptop. A very important lesson: you can't force the whole world into your own vision; you have to embrace what people want, make it your vision, and maybe add more on the side. After all, people are the consumers. Unfortunately, this was the second time Microsoft did that, after they had already suffered losses on the server and hosting side.

However, I do disagree with Gartner's conclusion that Microsoft or Windows will crash. I believe Microsoft is trying to embrace standards and work with others; they're just not doing enough, and many people doubt their intentions. My view is that Microsoft will become more like Sun or IBM, where they will have to work with people and provide solutions for the problems people actually see, in the ways people want.


Creating a tar gz on the fly using PHP

Thursday, March 27th, 2008

A while ago, I thought about creating a tar.gz file for every download, so that if someone runs a search, they can then download all the images in the results. After a bit of research, I found that PHP has a function for gzip. I also knew that the tar format simply sticks files one after another, so if I could implement the tar format in PHP, I could gzip all the images in the results.

I found this LGPL code that implemented the tar format. I used it (and modified it a little) to produce the online tar.gz functions:

// Computes the unsigned checksum of a file's header
// to try to ensure a valid file
// PRIVATE ACCESS FUNCTION
function __computeUnsignedChecksum($bytestring)
{
  $unsigned_chksum = 0;
  for($i=0; $i<512; $i++)
    $unsigned_chksum += ord($bytestring[$i]);
  for($i=0; $i<8; $i++)
    $unsigned_chksum -= ord($bytestring[148 + $i]);
  $unsigned_chksum += ord(" ") * 8;

  return $unsigned_chksum;
}

// Generates a TAR section from the processed data
// PRIVATE ACCESS FUNCTION
function tarSection($Name, $Data, $information=NULL)
{
  if ($information === NULL)
    $information = array("user_id" => 0, "group_id" => 0,
                         "user_name" => "", "group_name" => "");

  // Generate the TAR header for this file
  $header  = str_pad($Name,100,chr(0));
  $header .= str_pad("777",7,"0",STR_PAD_LEFT) . chr(0);
  $header .= str_pad(decoct($information["user_id"]),7,"0",STR_PAD_LEFT) . chr(0);
  $header .= str_pad(decoct($information["group_id"]),7,"0",STR_PAD_LEFT) . chr(0);
  $header .= str_pad(decoct(strlen($Data)),11,"0",STR_PAD_LEFT) . chr(0);
  $header .= str_pad(decoct(time()),11,"0",STR_PAD_LEFT) . chr(0);
  $header .= str_repeat(" ",8);
  $header .= "0";
  $header .= str_repeat(chr(0),100);
  $header .= str_pad("ustar",6,chr(32));
  $header .= chr(32) . chr(0);
  $header .= str_pad($information["user_name"],32,chr(0));
  $header .= str_pad($information["group_name"],32,chr(0));
  $header .= str_repeat(chr(0),8);
  $header .= str_repeat(chr(0),8);
  $header .= str_repeat(chr(0),155);
  $header .= str_repeat(chr(0),12);

  // Compute the header checksum
  $checksum = str_pad(decoct(__computeUnsignedChecksum($header)),6,"0",STR_PAD_LEFT);
  for($i=0; $i<6; $i++) {
    $header[(148 + $i)] = substr($checksum,$i,1);
  }
  $header[154] = chr(0);
  $header[155] = chr(32);

  // Pad file contents to a byte count divisible by 512
  $file_contents = str_pad($Data,(ceil(strlen($Data) / 512) * 512),chr(0));

  // The tar section is the header followed by the padded contents
  $tar_file = $header . $file_contents;

  return $tar_file;
}

function targz($Name, $Data)
{
  return gzencode(tarSection($Name,$Data),9);
}

To use these functions, all you have to do is send a header with the MIME type for tar.gz ( application/x-gzip ) using the PHP header function. To add a tar/gz section for a file, read the file into a string using file_get_contents, pass the filename and data to the targz function, and echo what is returned. That's it!

So why is it not active on the clker.com website? I actually tried it and found that the compression consumes a lot of CPU. In the first 20 minutes I had more than one hundred connections from different users downloading their results, and the CPU was saturated, which left basically no CPU for searching. So use it carefully, and only if you really need the functionality.


Running your own server is easy and fun, but involved

Saturday, March 1st, 2008

I love using Linux to do my work. My best use of Linux is my web server, although I recall reading once that Linus never intended the kernel to be used on servers; he was more focused on desktops. I've been running my own web server for almost a year now. It runs two websites: mibrahim.net, my real estate website, and clker.com, a to-be online clipart website – we're halfway there.

The fun part is that everything just works. You'll have all the tools you need, from database engines like Postgres and MySQL, to scripting languages like PHP and Ruby, to different webservers such as Apache, lighttpd, and others. All the tools you might think of are there and at your fingertips. Building your own server is not expensive – around $100 will do it. You don't need a super quad-core machine to produce extremely fast websites, unless you are already getting more than 50 page requests per second; at that point you will need something faster.

The performance bottleneck is rarely the CPU; it's the hard drives' read and write speeds. You can improve on that with software RAID. Almost all Linux distros offer software RAID, and it is the cheapest way to improve read speed.

Setting up your server is not a hard process. The distributions I recommend most are Debian and Ubuntu, because of the very large library of software that comes with each; I believe the full distribution has now grown to more than 11 CDs. I used to run Debian and switched to Ubuntu a year ago. The reason for the switch is the faster updates I get from Ubuntu, which let me use more recent versions of PHP and the database engines.

The easiest setup uses the Ubuntu server CD, which is no different from the desktop CD in terms of binaries. The only difference is that it won't install the X11 server (GUI) or the window managers (GNOME or KDE), and the installer itself runs in the console rather than in VGA graphics. I use the server installation and connect to my server using ssh. I have another old machine that runs Ubuntu as well and is used to run FreeNX. That way I keep the server's memory for the running services, and I can add all the GUI programs I want on the old machine.

Since I greatly benefited from running my own web server, I will share my experiences every now and then when I’ve got time to write.

