
Re: Cybersitter and the like: inadequate



On Tue, 02 Oct 2001 00:25:28 +0200, Michael Plate wrote:
Good evening, Mr. Plate,

kl>>to deal with this junk software
kl>>and to waste public funds and public working time
kl>>on it.
kl>>
kl>Well... that, at least, can in theory be remedied. At
kl>www.squidguard.org/blacklist/
   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
kl>you will find a list and, at the end, two links as well. The
kl>lists are free of charge; I can't say anything about their
kl>quality, though, since we use whitelists. Whether you can also
kl>use them with Windows I don't know - we use Squid as our proxy.
kl>
kl>A colleague uses these lists at the Haus der Jugend and is
kl>satisfied - he picks "slip-throughs" out of the log and adds
kl>them afterwards, so that at least they are no longer reachable
kl>the next time.

Interesting. And how much time goes into that?
Isn't that rather time-consuming - reviewing the logs, making the
entries?
Could you give a rough estimate?
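
For what it's worth, the log-reading part could largely be scripted.
A rough sketch - not your colleague's actual procedure - assuming
Squid's native access.log format and a plain-text squidGuard domain
list; the paths are only illustrative:

    # Collect hosts from Squid's access.log that are not yet in a
    # squidGuard domain list, so they can be reviewed by hand.
    from urllib.parse import urlsplit

    ACCESS_LOG = "/var/log/squid/access.log"               # assumed path
    DOMAINLIST = "/usr/local/squidGuard/db/porn/domains"   # assumed path

    def host_of(url):
        # CONNECT requests log "host:port", everything else a full URL.
        if "://" in url:
            return urlsplit(url).hostname or ""
        return url.split(":", 1)[0]

    with open(DOMAINLIST) as f:
        known = {line.strip() for line in f if line.strip()}

    candidates = set()
    with open(ACCESS_LOG) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 7:
                continue
            host = host_of(fields[6]).lower()   # URL is the 7th field
            if host and host not in candidates and host not in known:
                candidates.add(host)

    # The manual part described above: review the candidates, append
    # the real "slip-throughs" to the domain list, then rebuild the
    # db files (e.g. "squidGuard -C all") and reload Squid.
    for host in sorted(candidates):
        print(host)

The time-consuming part would then only be skimming the candidate list.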


Thanks
k.l.


PS: I had a look at the URL mentioned above... :


-> New:

   The porn section of the blacklist now has more than 100,000 entries!
   The robot has been rewritten from scratch (as of 2.0.0) and takes
   about 7 hours to complete, a major improvement over the earlier
   version, which took 36 hours to do a worse job.
   The blacklist is now split into subsections (porn, aggressive, drugs,
   hacking, ads, ...) to better match different needs. The filename has
   therefore changed from blacklist.tar.gz to blacklists.tar.gz to
   reflect this. The new lists may be more or less empty and incomplete
   in the beginning. Suggestions are welcome.
   The new robot uses BerkeleyDB with the same strategy as squidGuard,
   though limited by the string/character/pointer capabilities of Perl.
   The new robot should handle exceptions and redundancy much better
   than before.
   The new robot should also be much smarter in selecting the right
   domain/directory level for blocking, thus making a more powerful
   blacklist with the same number of entries.
   The robot now also resolves the host addresses and automagically
   adds the corresponding in-addr versions to both the domain and URL
   lists.
   The robot now also makes lists of new entries to simplify revision
   between versions.
   Note for those who run the robot: the command line options have
   changed. The robot is still undocumented (you'll have to read the
   Perl code to see if you can figure out what it does and how it
   should be configured), so the robot is not recommended for use at
   the average site.
   The robot is now run three times a week.

   Note: The blacklists are entirely the product of a dumb robot. We
   strongly recommend that you review the lists before using them.
   Don't blame us if there are mistakes, but please report errors with
   the online tool below.

The online tool is under reconstruction (when I find time). In the
meantime, send updates to blacklist _at__ squidguard.org.




  Alternative blacklists

Linugen
   A nice online blacklist database. Thanks to Tom Schouteden.
Université Toulouse
   A compilation of blacklists. Thanks to Fabrice Prigent.
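
Side note: with squidGuard itself, those split lists are referenced
per category in squidguard.conf. A minimal sketch, assuming the stock
layout under /usr/local/squidGuard/db with blacklists.tar.gz unpacked
there; the chosen categories, paths and redirect page are only
examples:

    dbhome /usr/local/squidGuard/db
    logdir /usr/local/squidGuard/log

    dest porn {
        domainlist porn/domains
        urllist    porn/urls
    }

    dest ads {
        domainlist ads/domains
        urllist    ads/urls
    }

    acl {
        default {
            pass !porn !ads all
            redirect http://localhost/blocked.html
        }
    }

After editing the lists (or pulling a new blacklists.tar.gz), Squid has
to be reconfigured or restarted - and the compiled db files rebuilt, if
you use them - for the changes to take effect.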

-

Klaus Lehmann                               Address (as of 28 Jun 2001):
Netware/WinNT admin, Friedrichsh.-Kreuzberg Schleiermacherstr. 13
and allegro-C services                      D-10961 Berlin
-database cleanups, safer shells            phone  49-30-8950 3156
-error indexes, external data import/export mobile 49-0171-9537843
-Novell Netware, Windows NT servers, etc.   phone  49-30-2977856122
 eMail: klehmann _at__ arco-online.de        fax    49-30-2977856128



List information at http://www.inetbib.de.