sjh - mountain biking running linux vegan geek spice - mtb / vegan / running / linux / canberra / cycling / etc

Steven Hanley

About

email: sjh@svana.org

web: https://svana.org/sjh
twitter: https://twitter.com/sjhmtb
instagram: https://instagram.com/sjhmtb

Fri, 09 Dec 2011

My software works too well, change it back - 10:23
I have upgraded a few of the systems at work recently to a far more recent image, this one based on Feisty (users still get to choose which environment they log in to: KDE, GNOME, something else, etc). A short while after putting the image on James' desktop he wandered over and asked if I had doubled the size of the swap partition. When I said that had not changed he was almost amazed, because only around half the memory that was in use before the upgrade was in use now.

It appears the profiling and lower memory footprint work by the various gurus in the KDE, GNOME and similar camps has paid dividends: there is a pretty big drop in memory usage and leaks here, and everything feels a bit faster, all of which is good news. Not that I have done any real testing, but perceived feel is relevant to some extent in a computing environment.
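For anyone wanting numbers rather than perceived feel, a rough before and after check could be something like the following (a quick sketch only; free and ps are standard tools, the user name is made up, and summing RSS double counts shared pages so it only gives a ballpark figure).

free -m
ps -u james -o rss= | awk '{sum += $1} END {printf "%.0f MB resident\n", sum/1024}'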

The most amusing thing here, I thought, was how he asked the question: it sounded almost as if something was wrong. As if James was saying "my computer is not using enough memory, and is running too fast, fix it, make it as slow and hoggy as it used to be". I guess at least he was not about to request a change to a computing system that seems to constantly get slower and more user unfriendly with every major release.

[/comp/software] link

Fri, 16 Jul 2010

Today's strangely named Debian package - 16:25
I was looking through Debian packages that have something to do with image analysis, to see what code is out there for working out meta information about images. One that showed up while looking at the Python ITK bindings was perlprimer.

This definitely sounded odd, as the package name suggests it is some sort of perl instruction package. When I looked at the output of apt-cache show perlprimer it seemed even stranger. The description contains the following: "open-source GUI application written in Perl that designs primers for standard Polymerase Chain Reaction (PCR), bisulphite PCR, real-time PCR (QPCR) and sequencing."
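For reference, pulling that description out is just the usual apt commands (nothing clever here):

apt-cache show perlprimer | grep -A 4 '^Description'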

So this is in fact a genetic research related package, with the name perlprimer (it is, admittedly, written in perl). I know Debian packages tend to be named on a first in, first named basis, however this definitely strikes me as deceptive/strange. Obviously all the mad gene scientists are out there trying to hide their work with deceptive package names... or something.

[/comp/software] link

Wed, 30 Sep 2009

Rockbox freezes on ogg files fix - 17:41
After messing around with my iPod for a while, upgrading it to Rockbox v3.4 to see if it would play some ogg files it was freezing on, I at last discovered a simple fix. If an ogg file has id3 or id3v2 tags it appears Rockbox will refuse to play the file. I had been wondering for a year or so why the Violent Femmes albums I ripped onto the iPod would not play, however I was not too fussed as I was not listening to them overly much. However I ripped a new album I bought and was most annoyed to find I could not play it.

Happily, now that I have discovered the problem I have easily removed all the id3 and id3v2 tags from the ogg files on the device with "find -name '*.ogg' -print0 | xargs -0 id3v2 -D" and hey presto I can now play all these files again. The ogg/vorbis tags remain intact; for some reason I had "add id3 tags" ticked in grip without restricting it to files ending in .mp3.
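To confirm the cleanup did what I expected, something along these lines can be run over the files on the device (id3v2 and vorbiscomment come from the id3v2 and vorbis-tools packages respectively):

# no id3 tags should be reported on any ogg file now
find . -name '*.ogg' -exec id3v2 -l {} \;
# the vorbis comments (artist, title and so on) should still be intact
find . -name '*.ogg' -exec vorbiscomment -l {} \;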

[/comp/software] link

Tue, 08 Jul 2008

How to capture one image from a v4l2 device - 17:22
So after seeing Mikal wondering about it again yesterday, I had a look at some source code and decided that it could be done, but that it would be nicer to do it with existing software. I recalled seeing ffmpeg or mplayer commands that may in theory be able to capture a single image. Then I stumbled upon a way to do this with gstreamer filters and sinks.

"gst-launch-0.10 v4l2src ! video/x-raw-yuv,width=640,height=480 ! ffmpegcolorspace ! pngenc ! filesink location=foo.png"

That one command captures an image at the given resolution into the file foo.png. This was on my laptop; I also tested it with the QuickCam 9000 on my desktop at a resolution of 1600x1200 and it worked. The focus meant it took a while, but it popped out a good image. GStreamer really is cool; I still remember seeing Federico talk about GMF (GNOME Media Framework, which is what became GStreamer) at CALU in 1999 and being excited by it.
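If the pipeline is left running it will keep grabbing frames; a variation that should stop after exactly one frame is to limit the source with num-buffers (a standard gstreamer source property), though the command as given above has been enough for me:

gst-launch-0.10 v4l2src num-buffers=1 ! video/x-raw-yuv,width=640,height=480 ! ffmpegcolorspace ! pngenc ! filesink location=foo.png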

[/comp/software] link

Thu, 25 Oct 2007

A wireless scanning tool - 19:26
I just wasted about 15 minutes trying to find online the name of the program I have installed on my laptop that I regularly (though not for a few months now) use for scanning for wireless networks.

Hopefully I will remember this post and be able to look it up; the tool in question is swscanner (a KDE wireless scanner application).

[/comp/software] link

Fri, 05 Oct 2007

No count-words-region or similar in emacs? - 14:10
I have no idea how I never noticed this before. I was writing something a few minutes ago and wished to know how many words were in a section of it, plain text in emacs. I tried Meta-X count-<tab> and a few variations and could not find a command that would count the words in a region of text, or a buffer, or anywhere else. Strange, I thought, and decided to search online.

From search engine results I found that somehow emacs does not ship, anywhere by default, the few lines of lisp required to do this seemingly simple thing. There are some reasons this may be the case, the first of which is that the definition of what constitutes a word may be in question, especially in different modes. However I just want a basic text mode word count capability.

Many online suggestions launch a sub-shell and run wc on a buffer or a section of a buffer, which is obviously overkill. Fortunately one of the first search results was an elisp intro with a section detailing a count-words-region function, which is exactly what I needed, so it is now in my .emacs file.

The two things I find most surprising about this state of affairs are: 1. that emacs does not have this capability somewhere in the huge amount of elisp distributed with it, and 2. that though I have been using emacs a lot for more than 10 years I had never before noticed it was lacking.

[/comp/software] link

Wed, 12 Sep 2007

Google maps API is kind of neat. - 11:31
I purchased a Garmin Forerunner 305 a few weeks ago; this is a combined HRM and GPS device, pretty much aimed at sports people as a training tool. Mikey and Tony from ozlabs have been working on some code (gpsruns) that grabs the data and uploads it to your website to interface with the Google Maps API.

For example here is the API and Maps link from an 18 km run I did last night. I wear it cycling, paddling and running and it is interesting to see the data. However I have been thinking there are more interesting ways to represent some of that data in the graphs over time. I had a look at the Google Maps API documentation yesterday and am impressed with how much you can actually do.

I was thinking it would be cool to display information such as distance, HR, speed, direction and other things in the line plotted on the map. Looking at the PolyLine documentation I am happy to see it can be done. I will need to divide the plot into sections over whatever range of change I want to display. Then I can, for example, put a key on the page showing which colour corresponds to which heart rate and display the map line changing colour for different heart rates over the course of the exercise. I can also put up more plot points displaying distance covered, speed or gradient changes in different colours. I guess it is time I got hacking on this code along with Mikey and Tony.

[/comp/software] link

Wed, 18 Jul 2007

Far less painful than expected - 16:47
I asked Mikal if he had any experience in burning copies of DVDs, notably where there is more than 4.7 GB of stuff to fit onto a DVD-RW disc. He said no and asked me to report on the details of all the pain and suffering I went through to make it happen.

I feel almost cheated, and I am sure Mikal will be sad to hear how easy it all was, however because he asked, here are the details.

The only caveat with this method is that it appears many (possibly all) of the extra features on the DVD will not work from the menus (and, looking at the mounted iso image, they may have been removed). However it has copied across the primary documentary that is the reason for owning the DVD; the resolution is reduced in parts, but that will probably not be particularly noticeable on a TV screen.

The simple process used for this is "apt-get install dvd95 vamps ; dvd95 &"

The dvd95 program will even burn the iso image it creates for you, or you can ask it not to and burn it yourself with growisofs or similar. I think the next step, if I am feeling keen, is to read up on ways to split the DVD into two, retaining the menus but having the movie on one disc and the extras on another, or some other way to retain all the extra features. Right now however I do not feel that need.
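For the burn-it-yourself option the growisofs invocation is simple enough (assuming /dev/dvd is the burner; the iso name here is made up, use whatever dvd95 produced):

growisofs -dvd-compat -Z /dev/dvd=documentary_shrunk.iso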

[/comp/software] link

Mon, 04 Jun 2007

A Google mail complaint - 20:46
So like all the other sheep in the world I do have a Gmail account, though I do not use it for anything much or tell people what it is. However a while ago I set it up such that it will forward all email received to my main email address. Then I subscribed the account to a mailing list I found was not delivering to my normal address correctly for some reason.

So far all of this sounds fine; however I noticed over the past week there were a fair few email on the list that I seem to have missed. I logged into Gmail and found it had rather nicely stopped a bunch of spam. However it had also stopped 42 list email in the past month or three. So I went through all the spam it had stopped, marked the list mail as not spam, and thus it was moved into the inbox.

Now I thought to myself I simply have to forward (or bounce) these 42 email to myself (there is no option to reprocess them with the default forwarding rule). Unfortunately this cannot be done: there is no way to mass forward or bounce email to another location. Sure, I could open every individual email and forward it, but that would take forever, and I would prefer to bounce them to myself so the headers remain as they should be. (Bounce is a feature Thunderbird also does not have, even though there have been open bugs against Mozilla mail since Mozilla was open sourced; I am aware there have sometimes been Thunderbird plugins to do this, but they tend not to be kept up to date. That is another rant for another time.)

So looking through the help files for Gmail I find they are serious: there is no way to get more than one email at a time sent on. They suggest enabling pop3 and downloading the mail. Okay, I can do this, however upon trying it turns out it will download all the email that has ever come through to Gmail, not just the stuff in the inbox. I only want a local copy of these 42 email; if only it were not so hard. I have heard of APIs for Gmail, so that may be the next place to look.
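For completeness, the pop3 route they suggest looks roughly like this with fetchmail (a sketch only; pop.gmail.com over SSL is what Gmail documents, and the account name is made up), and as noted it will happily pull down everything Gmail exposes over pop3 rather than just those 42 email:

poll pop.gmail.com with proto POP3
    user "someaccount@gmail.com" there with password "secret" is sjh here
    ssl fetchall keep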

Admittedly I use mutt as my primary email client and am not at all familiar with Gmail, so I may be missing something, but so far my rather specific needs are proving hard to meet. I guess at least I do have access to all my email data there rather than it being closed off and locked away somehow.

[/comp/software] link

Thu, 10 May 2007

Recovering data from a dbx file - 17:10
Maybe this should have a Dear Lazyweb heading?

So I have been trying to extract some email from a Microsoft Outlook Express 6.0 DBX file for a friend. She accidentally deleted a lot of email in a mailbox. The email is all still in the file, however there is no way I can find to get it out cleanly.

Running strings over the dbx file finds all the old email, though in a corrupted sort of output. There are some dbx libraries for linux with programs such as readdbx (and perl libraries based on them). However running these extracts only the email that still shows up in the mailbox in Outlook, not the deleted content. The DBX file is over 5 MB, yet the available linux dbx libraries extract about 120 KB of data. The strings output is close to the full 5 MB (the attachments, being base64 encoded, are of course recognised as strings).
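A rough way to see how much is actually lurking in the file compared to what readdbx recovers is to count likely message headers in the strings output (the file name is made up; this does not reconstruct anything, it just gives an idea of the scale of the problem):

# roughly how many messages appear to be in the raw file
strings mailbox.dbx | grep -c '^From: '
# dump everything readable for manual salvage attempts
strings -n 8 mailbox.dbx > mailbox-strings.txt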

I wonder if anyone knows of linux software that can extract all the email from a dbx file, even messages where Outlook has changed the leading few bytes (or whatever it does) to indicate they no longer exist?

The best option I can find so far that may possibly work, though I have no idea what it can actually do, is a utility called DBXtract that runs on Windows and costs USD $7. It would be nice to be able to extract this to mbox format on linux though.

[/comp/software] link

Sat, 15 Jul 2006

Mythtv manual record problems - 20:54
I have now seen two really strange things with my mythtv setup and manual recording schedules.

Yesterday evening I was glancing over the upcoming recording schedule. I had set it to record the Tour de France highlights show every evening at 6pm on SBS, and this has been working fine for the week and a half since I first set it up. However for some reason the Saturday and Sunday evening sessions were not in the schedule. No idea why. When I noticed this I deleted the lot and made another manual recording telling it to tape every 6pm half hour slot from tonight onward, and it was in there fine for as long as it should be.

Tonight I saw something even stranger. While the mythtv box was recording Dr Who (according to the status screen it was currently recording it) I did an ls in the directory all the recordings are stored in. I saw the 1800 file from the tour highlights (a half hour show) and then noticed there was no new Dr Who file (1930 for an hour) being recorded. As I did the ls while the show was supposedly being recorded, I tried a ps auxw and noticed the tuner was indeed doing something, as the [cx88 dvb] kernel process was there. For some unknown reason the damn software did not actually save the file to disk. After the show finished I had a look at the upcoming schedule and noticed it had removed all future recordings of Dr Who from the schedule (every Saturday at 19:30 for an hour).

I have no idea why this is happening. Paul Wayper has suggested I should put the effort into ensuring the guide data works and is tied to channels, so recordings can be done through that rather than by simply requesting a recording at some given time. However it is somewhat strange to see manual recordings playing up in this manner. For now I will simply have to be careful, regularly check that the recordings I request are in there, and hope they all actually get written to disk.

[/comp/software] link

Wed, 10 Aug 2005

I'm sorry Dave I can't let you do that - 15:44
I just saw a freshmeat announcement in my blog reader for libnodave. I wonder how much the person naming the library was giggling about "2001: A Space Odyssey" at the time?

[/comp/software] link

Extra goodness with disc-cover - 14:49
So I have been using the program disc-cover (there is an online disc-cover server for those unfortunate enough not to have unix on their desktops) for about 5 years now. I keep copies of all the CDs I listen to in the car rather than the originals, as I would be heartbroken if the car were broken into and I lost the original albums. It also means the originals stay in the house.

disc-cover looks at a cd in the drive, does a cddb/freedb lookup, and produces postscript (or other format) output that can be used in the jewel case. Anyway yesterday I received some new CDs (I will blog about this later) and the artists are not well enough known to have listings in freedb. Grip and other programs thus displayed "unknown" and such. I wanted to generate a cddb entry to place in my ~/.cddb/ directory that these programs could use. Freshmeat and google did not produce much of interest for outputting the correct file. Much to my delight, after glancing at the disc-cover man page I was reminded that one of the output formats available is cddb.


disc-cover -n -t cddb

This generates the hex string used to name a file, and the file Artist_-Disctitle.cddb, which I was able to rename into my ~/.cddb directory, and all works fine. I suppose I should look at uploading the album details to freedb; however when I last looked, about 5 years ago, uploading to freedb was a non-trivial task, though I notice there appear to be applications that can do it more easily these days.
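So the whole dance is just the command and a rename (sketched here with made up names; the hex disc id is whatever disc-cover printed):

disc-cover -n -t cddb
# rename to the hex disc id printed by the command above (this one is made up)
mv Artist_-Disctitle.cddb ~/.cddb/7c0a890c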

Anyway disc-cover has once again proven to be damn useful.

Update: I notice on the FreeDB FAQ they mention grip as an application that may be used to submit new track information. Now that I have valid labels and grip has the information, I should be able to work out what button to press to do this. (I use grip as my cd playing/ripping application.)

Update2: Reading the FAQ more closely, it appears that although they tell you to use some application, I should be able to send the updates in a basic email: copy the entry generated by disc-cover and add some items such as DGENRE and DYEAR. As this will be the first upload, leaving the revision at 0 will be fine and all should work. I worked this out simply by looking at some existing entries for albums I already have cddb files for.

Update3: The first submissions I tried had blank subject lines. I noticed the developer link on the freedb site and found a Submit new entries document for developers of software, and it appears the subject line is the only thing my first submission attempts were missing. Anyway it is good to see that document there; I do not recall finding it when I last looked at submitting track listings.

[/comp/software] link

Mon, 25 Jul 2005

Faster exif - 21:57
So I notice Michael is trying to use ImageMagick to extract exif tags and he finds it kind of slow. Strange that: using perl and yet forking processes to learn things turns out to be slow. So I had a look through the output of apt-cache search exif perl and found the libimage-exif-perl library. It is not pure perl, there is C code in there to make it go faster, based on the exiftags utility, but it is likely to be a lot faster than forking a process and relying on the implementation in ImageMagick being fast.

I installed the library, had a look around for a photo I was unlikely to have modified at all (and thus would still have all its tags), found my 2004 Triple Tri photos, and used the following snippet of perl code.

time perl -e 'use Image::EXIF; use Data::Dumper;
  my $exif = new Image::EXIF("img_0869.jpg");
  my $all_info = $exif->get_all_info();
  print $exif->error ? $exif->errstr : Dumper($all_info);'

I get the following output.

$VAR1 = {
          'unknown' => {
                         'Min Focal Length' => 'num 24, val 0x00AD',
                         'Focal Units/mm' => 'num 25, val 0x0020',
                         'Manufacturer Notes' => '942',
                         'Canon Tag1 Unknown' => 'num 06, val 0x0000',
                         'Interoperability IFD Pointer' => '1540',
                         'Comment' => '614',
                         'Canon Tag4 Unknown' => 'num 01, val 0x0000',
                         'Canon Unknown' => '1448',
                         'Flash Activity' => 'num 28, val 0x0000',
                         'Max Focal Length' => 'num 23, val 0x0207',
                         'Autofocus Point' => 'num 14, val 0x0000',
                         'Canon Tag4 Offset' => '1224',
                         'Supported FlashPix Version' => '808464688',
                         'Flash Details' => 'num 29, val 0x0000',
                         'Unknown' => '1600',
                         'Canon Tag1 Offset' => '1116'
                       },
          'other' => {
                       'Vertical Resolution' => '180 dpi',
                       'Canon Tag1 Length' => '92',
                       'White Balance' => 'Auto',
                       'Exif Version' => '2.20',
                       'Resolution Unit' => 'i',
                       'Focal Plane Res Unit' => 'i',
                       'Image Digitized' => '2004:11:20 17:58:35',
                       'Canon Tag4 Length' => '68',
                       'Shutter Speed' => '1/807 sec',
                       'Focal Plane Vert Resolution' => '7741 dpi',
                       'Image Generated' => '2004:11:20 17:58:35',
                       'Bytes of JPEG Data' => '5141',
                       'Metering Mode' => 'Pattern',
                       'Chrominance Comp Positioning' => 'Centered',
                       'Compression Scheme' => 'JPEG Compression (Thumbnail)',
                       'Horizontal Resolution' => '180 dpi',
                       'Image Type' => 'IMG:PowerShot A60 JPEG',
                       'Digital Zoom Ratio' => '1',
                       'Offset to JPEG SOI' => '2036',
                       'Image Compression Mode' => '3',
                       'Digital Zoom' => 'None',
                       'Sequence Number' => '0',
                       'Focal Plane Horiz Resolution' => '7766 dpi',
                       'Flash Bias' => '0 EV',
                       'Base Zoom Resolution' => '1600',
                       'Meaning of Each Comp' => 'Unknown',
                       'Self-Timer Length' => '0 sec',
                       'Zoomed Resolution' => '1600',
                       'File Source' => 'Digital Still Camera',
                       'Owner Name' => '',
                       'Exif IFD Pointer' => '196'
                     },
          'camera' => {
                        'Firmware Version' => 'Firmware Version 1.00',
                        'Camera Model' => 'Canon PowerShot A60',
                        'Equipment Make' => 'Canon',
                        'Lens Size' => '5.41 - 16.22 mm',
                        'Maximum Lens Aperture' => 'f/2.8',
                        'Sensing Method' => 'One-Chip Color Area'
                      },
          'image' => {
                       'Vertical Resolution' => '180 dpi',
                       'White Balance' => 'Auto',
                       'Contrast' => 'Normal',
                       'Rendering' => 'Normal',
                       'Compression Setting' => 'Fine',
                       'Image Height' => '1200',
                       'Image Orientation' => 'Top, Left-Hand',
                       'Color Space Information' => 'sRGB',
                       'Macro Mode' => 'Normal',
                       'Focus Mode' => 'Single',
                       'Exposure Mode' => 'Easy Shooting',
                       'Exposure Time' => '1/800 sec',
                       'F-Number' => 'f/2.8',
                       'ISO Speed Rating' => 'Auto',
                       'Image Width' => '1600',
                       'Scene Capture Type' => 'Standard',
                       'Image Size' => 'Large',
                       'Drive Mode' => 'Single',
                       'Lens Aperture' => 'f/2.8',
                       'Sharpness' => 'Normal',
                       'Metering Mode' => 'Evaluative',
                       'Horizontal Resolution' => '180 dpi',
                       'Shooting Mode' => 'Full Auto',
                       'Image Number' => '108-0869',
                       'Saturation' => 'Normal',
                       'Flash' => 'No Flash, Auto',
                       'Image Created' => '2004:11:20 17:58:35',
                       'Focus Type' => 'Auto',
                       'Flash Mode' => 'Red-Eye Reduction (Auto)',
                       'Focal Length' => '5.41 mm',
                       'Exposure Bias' => '0 EV',
                       'Subject Distance' => '2.720 m'
                     }
        };

real    0m0.068s
user    0m0.034s
sys     0m0.002s

All this includes the time to load the libraries (Data::Dumper and Image::EXIF), the perl interpreter and the image file from disk, executing on my 1.4 GHz laptop. Admittedly that was the second time I ran the code snippet, though with a different image filename, so the libraries and interpreter were likely already hot in memory; it may blow out to all of 0.1 of a second if it has to do all that cold.

Use native perl and available libraries to make things fast; it makes a lot of sense. I often think of the perl bridge building quote in this sort of situation. Of course it kind of sucks if you are trying to write a book about using ImageMagick and find you have to use other tools because it is not as fast as you want, or something.
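For a rough comparison with the fork-per-image approach, timing ImageMagick's identify on the same file is a one liner (the %[EXIF:*] format string should dump the exif properties, though the output and speed will vary with the ImageMagick version):

time identify -format '%[EXIF:*]' img_0869.jpg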

I notice Brad (the guy who started livejournal) often gets heavily into making perl go fast; his stuff is worth reading sometimes just to make you think about what he is doing.

[/comp/software] link

Fri, 29 Apr 2005

tcpdump into remote ethereal? - 21:32
So yesterday I was debugging a network thing and needed to run ethereal against a machine on which I did not wish to have it installed. The normal way to do this would be to use "tcpdump -w somefile.tcpdump -s 1500 -i ethN not port 22" or similar so the entire packets being dumped are placed in somefile.tcpdump, then copy the file to a machine with ethereal installed and look at it there.

I think that is a bit of a pain in the arse to do, so I was thinking it would be neat to be able to run ethereal directly on the output coming back over a network link.

My initial thought was to use netcat and send the tcpdump output over the wire that way. Something like "tcpdump -w - -s 1500 -i ethN not port 22 and not port 3000 | nc otherhost 3000", then on otherhost I could try typing "nc -l -p 3000 | ethereal -r -". So I tried that and ethereal balked at reading from stdin. The next thing to try was a fifo, using "mkfifo etherealdata ; nc -l -p 3000 > etherealdata" and running ethereal and telling it to open that file. However, though I have not looked closely, it appears ethereal probably tries to mmap files or read them all in at once or similar, so opening a fifo just won't work.

Looking at the start capture option in ethereal there is currently no way to capture actively on anything but an ethernet device. I am thinking maybe ethereal needs a patch to be able to start and stop captures on some given file handle, ignoring the data on that filehandle at other times, and thus make it easy to capture on stdin or similar.

Of course there may be another solution to this I have not thought about yet. I notice over the last year or so I really have not done much in the way of cool or fun geeky stuff; I think maybe I should do some more fun geeky things again. Maybe this can be a gentle start back into it.
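For what it is worth, the thing I would try next is to skip netcat entirely and pipe over ssh; newer versions of ethereal (and its successor wireshark) can capture from a pipe with -i -, so a sketch like the following should work, with the host name, interface and filter adjusted to taste (-U asks tcpdump not to buffer its output):

ssh capturehost "tcpdump -s 1500 -U -w - -i ethN not port 22" | wireshark -k -i -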

[/comp/software] link

Mon, 28 Mar 2005

Faster directory reading - 20:13
I suppose if Jeremy is responding to some of Mikal's perl it will not hurt for me to do so as well.

Mikal asks if there is a faster way to read a directory than using open and ls and such. This reminds me a bit of a buttload entry:

<Schwern> Are we using perl to generate a shell script?
<Schwern> Its like building a bridge across a canyon so you can tie a rope and cross with that.

Anyway with perl and TMTOWTDI the answer is yes, you can do this faster. Michael, the simplest way is simply to use opendir and readdir; if you look at the documentation in "perldoc -f readdir" you can see an example. Using native perl is always going to be faster than launching a separate shell (which you probably realise, explaining why you wondered if there is some faster way).

Of course looking at the things you have been asking today I wonder if the perl module File::Find may help you out with doing some of the stuff you need.

[/comp/software] link

Fri, 11 Feb 2005

Strange behaviour from liferea - 22:47
I have been using the liferea aggregator for a while now to read news feeds on my laptop. Tonight I noticed my laptop hard disk was doing a disk access every 5 seconds. As I did not have much software open I found this a surprise. In the end I tracked it down to liferea.

Wondering what was causing it, I straced the program:

[22:35:10] 101 oneiros sjh ~>ps auxw | grep liferea
sjh 26098 0.1 1.9 49256 12416 pts/6 Sl 22:21 0:02 /usr/bin/liferea-bin
..
[22:35:19] 102 oneiros sjh ~>strace -tt -p 26098 -o /tmp/sout
Process 26098 attached - interrupt to quit
Process 26098 detached
[22:35:59] 103 oneiros sjh ~>egrep -v 'ioctl|gettimeofday|poll' /tmp/sout | less

That gets rid of the calls it makes constantly and leaves me with the remaining system calls and the times at which they happened. Looking at this, there is something obvious happening every 5 seconds that would indeed cause a disk access.

22:36:04.928666 mknod("/home/sjh/.liferea/new_subscription", S_IFIFO|0600) = -1 EEXIST (File exists)
22:36:04.928740 open("/home/sjh/.liferea/new_subscription", O_RDONLY|O_NONBLOCK) = 13
22:36:04.928783 read(13, "", 256)       = 0
22:36:04.928812 close(13)               = 0
...
22:36:09.928930 mknod("/home/sjh/.liferea/new_subscription", S_IFIFO|0600) = -1 EEXIST (File exists)
22:36:09.928992 open("/home/sjh/.liferea/new_subscription", O_RDONLY|O_NONBLOCK) = 13
22:36:09.929036 read(13, "", 256)       = 0
22:36:09.929065 close(13)               = 0
...
22:36:14.929187 mknod("/home/sjh/.liferea/new_subscription", S_IFIFO|0600) = -1 EEXIST (File exists)
22:36:14.929253 open("/home/sjh/.liferea/new_subscription", O_RDONLY|O_NONBLOCK) = 13
22:36:14.929296 read(13, "", 256)       = 0
22:36:14.929324 close(13)               = 0
...
22:36:19.929446 mknod("/home/sjh/.liferea/new_subscription", S_IFIFO|0600) = -1 EEXIST (File exists)
22:36:19.929502 open("/home/sjh/.liferea/new_subscription", O_RDONLY|O_NONBLOCK) = 13
22:36:19.929544 read(13, "", 256)       = 0
22:36:19.929572 close(13)               = 0

The file is a named pipe, which could change the disk activity behaviour somewhat, but under normal circumstances the following would apply.

Because it closes the file every time, the disk does a sync. I had a look through the config options for liferea and cannot find one to tell it to stop doing that. Of course I have the option of looking at the source to find out why all this is happening, but for now I think I will just stay annoyed at it. Unless anyone knows how to stop this behaviour?

Update: The Liferea author (Lars) emailed me and told me how to disable these checks, and that has stopped the disc accesses.

There is a gconf option to disable the checking of this pipe.

If you want to do so, set the boolean gconf key /apps/liferea/disable-subscription-pipe.

I started up gconf-editor and hey presto there are more configuration options there, and this one worked. Thanks Lars.
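For anyone who would rather not open gconf-editor, the same thing can presumably be set from the command line (assuming true is the sense that disables the pipe check, which is what the key name suggests):

gconftool-2 --type bool --set /apps/liferea/disable-subscription-pipe true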

[/comp/software] link

