FileZilla: How to refresh an FTP folder automatically

  • FiveO

    I'm using FileZilla to check whether new files have arrived in various folders on an FTP server, but I always have to refresh a folder manually with F5.

    When I click a folder that I have already clicked before, its listing is not refreshed.

    How can I make FileZilla always update (refresh) a folder's listing when I click it?

  • Answers
  • Dennis

    With FileZilla 3, this isn't possible. Ticket #8111 is an open feature request that asks for an option to disable the caching of the directory listing.

    You have two options:

    • Install FileZilla 2.2.32.

      Disabling the directory cache is straightforward in FileZilla 2:

      screenshot

    • If downgrading is not an option, you can download, modify and compile the source code.

      The modification is easy. The file src/engine/directorycache.h of the FileZilla 3.5.3 source code contains the following:

      /*
      This class is the directory cache used to store retrieved directory listings
      for further use.
      Directory get either purged from the cache if the maximum cache time exceeds,
      or on possible data inconsistencies.
      [...]
      */
      
      const int CACHE_TIMEOUT = 1800; // In seconds
      

      As you can see, the default timeout is 1800 seconds (30 minutes). Setting the timeout to zero should disable the directory cache.
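      For concreteness, here is a minimal sketch of the change, assuming CACHE_TIMEOUT is only read as an expiry threshold elsewhere in the engine; verify it against your own copy of the 3.5.3 sources before building:

      // src/engine/directorycache.h -- sketch of the modification only
      // A timeout of 0 seconds means every cached listing is treated as
      // already expired, which should effectively disable the directory cache.
      const int CACHE_TIMEOUT = 0; // In seconds; the stock value is 1800 (30 minutes)

      Whether this fully disables caching depends on how the constant is checked in the rest of the engine, so treat it as a starting point rather than a verified patch.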

      Compiling is a lot more difficult. The official tutorial Compiling FileZilla 3 under Windows explains how.


  • Related Question

    permissions - FTP from Windows to Linux using FileZilla causes doubling of file sizes?
  • Questioner

    Running FileZilla 3.3.0.1 (slightly older versions exhibit this behavior as well) against a Red Hat Enterprise 5.x machine with FileZilla Server, we are getting text files that double in size on overwrite. It seems to affect PHP, JS, and HTML files, though perhaps not binary files; that hasn't been thoroughly tested. We looked at the client settings and found one we thought was the culprit, 'Allow resume of ASCII files', which the client warns can cause problems when line endings differ between platforms, but we have unticked that option.

    Here's what happens:

    We open the client and connect to the FTP server. We upload a local file and choose to overwrite the remote copy. The file size should change slightly (e.g. increase from 117 kB to 118 kB), but instead the reported size is not refreshed. When you hit the manual refresh button, the remote copy is suddenly reported as double the size or more (e.g. 275 kB). What is going on?

    When we re-downloaded some of these files, it was as if a concatenation had happened (extra content appended to the file). Obviously we cannot micromanage every file like this; our whole system could be broken by it. Is this a permissions/ownership issue, or is something really odd happening in either the FileZilla client or the server?


  • Related Answers
  • Jeremy Morgan

    After talking to Rackspace, it turns out to be related to the sticky bit and setgid settings they had applied to the webroot folder for us, which allowed us to write to a file but not remove it first; the net result was that new contents were appended to the end of the old contents. We were trying to have the web server user own the files and the web server group be the same group as the FTP users' group, so that Apache had ownership and full access, FTP users had read-write permission, and world-readable documents could be served read-only, as outlined here: http://www.washington.edu/itconnect/web/publishing/permissions.html
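    Purely as an illustration of that layout (the user name "apache", the group name "ftpusers", and the path below are placeholders for whatever your system actually uses, and in practice you would just run chown/chmod from a shell), here is a small sketch of applying it to a single file:

    // Hypothetical sketch: apply the ownership/permission layout described
    // above to one file. Compile with any C++ compiler and run as root.
    #include <sys/stat.h>   // chmod()
    #include <unistd.h>     // chown()
    #include <pwd.h>        // getpwnam()
    #include <grp.h>        // getgrnam()
    #include <cstdio>       // perror()

    int main() {
        const char *path = "/var/www/html/index.php";  // placeholder file
        const passwd *web = getpwnam("apache");        // web server user (placeholder name)
        const group  *ftp = getgrnam("ftpusers");      // group shared with FTP users (placeholder name)
        if (!web || !ftp) { std::perror("user/group lookup"); return 1; }

        // Owner: the web server user; group: the shared FTP group.
        if (chown(path, web->pw_uid, ftp->gr_gid) != 0) { std::perror("chown"); return 1; }

        // 0664: owner and group read-write, everyone else read-only.
        if (chmod(path, 0664) != 0) { std::perror("chmod"); return 1; }
        return 0;
    }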

  • Steve

    Do you have access to the Red Hat server? If so, what file size is shown on that machine? If I had to guess, it is probably a FileZilla bug, as a quick search turned up a bug report similar to what you are experiencing: http://trac.filezilla-project.org/ticket/4788.