
Thread: nzbget - binary newsgrabber

  1. #76
    Quote Originally Posted by methanoid View Post
    Come on Neilt0, you've had several hours. Where is that guide??
    I'm at work today, so not now. Maybe later. Try it yourself though, it's not that hard.

  2. #77
    Quote Originally Posted by neilt0 View Post
    I'm at work today, so not now. Maybe later. Try it yourself though, it's not that hard.
    I downloaded the web i/f and the docs made it look too difficult for a lamer like me!

  3. #78
    Quote Originally Posted by neilt0 View Post
    I'm trying 000 now and will report back.
    Works!

    Note that with the version on the optware feed, you can open folders and files from another machine, but can't delete them.

    Only with the latest revision on sourceforge can you delete files.

    Thanks hugbug!
    Last edited by neilt0; 21-06-2008 at 18:54.

  4. #79
    Quick guide for nzbget on a LinkStation Live or Pro.

    Methanoid, try it out if you like -- at your own risk. Make backups etc. blah blah.

    If you spot any mistakes, let me know.

    Please do not post this guide anywhere else without my permission. Thanks.

    First: read all of http://nzbget.sourceforge.net/

    Code:
    ipkg install nzbget lighttpd php php-fcgi
    To get the correct umask (so files can be deleted from other machines), download the latest build from sourceforge, then change to the directory where you put the ipk and run:

    Code:
    ipkg install nzbget-0.4.1-testing-r178-bin-cs05q3armel-arm.ipk
    Code:
    vi /opt/etc/nzbget.conf
    Edit as follows (a sample of the resulting section is sketched below the list):
    1. $MAINDIR=/mnt/disk1/share/nzbget
    (or wherever you want it)
    2. Fill in the news server section. 20 connections is no problem for nzbget on a LinkStation Pro!
    3. UMask=000
    or whatever you want.
    4. DirectWrite=yes
    5. I turned on CRC checking and retries on CRC errors. Still fast.
    6. WriteBufferSize=-1
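
    Putting that together, the relevant bits of my nzbget.conf look roughly like this -- the server details are placeholders and the exact names of the server/CRC options are from memory, so check the comments in the nzbget.conf that ships with the package:

    Code:
    $MAINDIR=/mnt/disk1/share/nzbget
    Server1.Host=news.example.com
    Server1.Port=119
    Server1.Username=yourname
    Server1.Password=yourpass
    Server1.Connections=20
    UMask=000
    DirectWrite=yes
    CrcCheck=yes
    RetryOnCrcError=yes
    WriteBufferSize=-1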

    Download nzbgetweb and put it in /opt/share/www/nzbgetweb/

    Code:
    /opt/etc/init.d/S80lighttpd stop
    Code:
    vi /opt/etc/lighttpd/lighttpd.conf
    change

    # "mod_fastcgi",

    into

    "mod_fastcgi",

    change

    url.access-deny = ( "~", ".inc" )

    into

    url.access-deny = ( "~", ".inc", ".sqlite" )
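
    After both changes, the relevant parts of lighttpd.conf should read roughly as follows (surrounding lines omitted):

    Code:
    server.modules = (
    ...
    "mod_fastcgi",
    ...
    )
    url.access-deny = ( "~", ".inc", ".sqlite" )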

    Code:
    vi /opt/share/www/nzbgetweb/settings.php
    Settings here are well documented in the file.

    Code:
    /opt/etc/init.d/S80lighttpd
    Find a postprocessing script you like, put it wherever.

    Edit nzbget.conf to tell it where you put it.
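
    Assuming this nzbget version uses the PostProcess option for that (check the comments in your nzbget.conf if the name differs), the line looks something like:

    Code:
    # nzbget.conf: script to run after each download completes (example path)
    PostProcess=/mnt/disk1/share/nzbget/postprocess.sh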

    Code:
    chmod a+x /path/scriptname.sh
    Code:
    nzbget -D
    http://LINKSTATION_IP_ADDRESS:8081/nzbgetweb/

    Done and done!

  5. #80
    Thanks Neil, I've done half of it already myself but have to go out... will try later and report... Did you check speed by comparing file size to time or take the figure nzbget gave you?

    What options did you enable in the CONF? PAR2? UnRAR?

  6. #81

    Quote Originally Posted by methanoid View Post
    Did you check speed by comparing file size to time or take the figure nzbget gave you?

    What options did you enable in the CONF? PAR2? UnRAR?
    Par repair is configured in nzbget.conf; other postprocessing (unrar etc.) has to be done by a script.
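
    By "par repair in nzbget.conf" I mean the par options there, roughly as below (check the conf file comments for the exact names):

    Code:
    ParCheck=yes
    ParRepair=yes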

    The speed indication on the latest revision is more accurate. I didn't actually benchmark it, I just took it as read. It's definitely quick...

    EDIT:

    Benchmarks

    LinkStation Pro (400MHz ARM)
    128MB RAM
    750GB SATA drive
    XFS
    My max actual throughput ~16.1mbps

    Test file: 399.3MB NZB = 361.5MB actual download
    20 connections
    Web interface refresh rate: 1 second

    Download: 3 minutes, 6 seconds = 15.56mbit/sec
    Par check (repair not needed): 1 minute, 55 seconds
    Unrar: 1 minute, 3 seconds

    I'm happy with 15.5mbps+ of my 16.1mbps.

    Note also, I'm getting 50% of my max throughput using just ONE connection to the usenet server. This is a VERY efficient usenet downloader.
    Last edited by neilt0; 21-06-2008 at 21:29.

  7. #82
    Don't need to ipkg install php, as php-fcgi includes it?

    nzbgetweb/settings.php

    "Settings here are well documented in the file" -- but I think for a guide we need to be a tad more explicit... the last line of the conf file, for example?

    Could expand this:
    Find a postprocessing script you like, put it wherever.
    Edit nzbget.conf to tell it where you put it.
    chmod a+x /path/scriptname.sh

    Anyway, I got the webserver up but it's not working. Whenever I try to upload a file I get
    "Error: Check the path and the permissions for the upload directory (option nzbdir)"

    I notice nzbget doesn't create directories when run, so I had to create them manually in Windows; they show as nobody|nogroup for owner, but I did chmod -R 777 on them and still no joy...

    So currently I cannot use nzbget yet...

    And even when I can... well, it needs some extra tool, as SABnzbd has a cool Firefox addin, NZBstatus, which would be nice to have with nzbget.

    I'm certainly willing to change... one plus for nzbget is it won't interfere with running a website like SAB can with port usage...

  8. #83
    Quote Originally Posted by methanoid View Post
    I notice nzbget doesn't create directories when run, so I had to create them manually in Windows; they show as nobody|nogroup for owner, but I did chmod -R 777 on them and still no joy...
    It should create all directories. Probably a permission problem. Are there any errors in the log file?
    Try starting the program from the root account.

    Quote Originally Posted by methanoid View Post
    Anyway, I got the webserver up but it's not working. Whenever I try to upload a file I get
    "Error: Check the path and the permissions for the upload directory (option nzbdir)"
    First check whether nzbget works by uploading nzb-files via samba or ftp. Then we can try to fix the problem with the web interface.

  9. #84
    Quote Originally Posted by hugbug View Post
    It should create all directories. Probably a permission problem. Are there any errors in the log file?
    Try starting the program from the root account.

    First check whether nzbget works by uploading nzb-files via samba or ftp. Then we can try to fix the problem with the web interface.
    Hi, OK, it was a warmware problem.

    I edited the sample conf file and missed a ~, so my NZBget dir was ~/mnt/disk1/share/nzbget rather than /mnt/disk1 etc.... Ooooops

    nzbget now creates the dirs but won't let me copy files into them, so I had to chmod -R 777 the nzbget directory, and now I can copy an NZB file in but NZBget isn't doing anything with it... I run NZBget with "nzbget -D"

    Nothing in the log file bar NZBget running in daemon mode.

    If nzbget is as good on a LinkStation as Neilt0 says, I'll be keeping this LinkStation and saving myself buying a low-power PC for downloads...

    My fingers are crossed!!

  10. #85
    Quote Originally Posted by methanoid View Post
    nzbget now creates the dirs but won't let me copy files into them, so I had to chmod -R 777 the nzbget directory
    There was a permission bug. Install the latest version from nzbget's home page with
    Code:
    ipkg install http://surfnet.dl.sourceforge.net/sourceforge/nzbget/nzbget-0.4.1-testing-r178-bin-cs05q3armel-arm.ipk
    Set the option "UMask=000" in the configuration file.

    The created directories should now be accessible from all user accounts, including samba.

    now I can copy an NZB file in but NZBget isn't doing anything with it...
    Check the options "NzbDirInterval" and "NzbDirFileAge". A value of "5" for the latter should be enough. Check whether the date/time is set properly on your LinkStation and whether the nzb-file has a proper date/time (files downloaded from some indexing servers have an erroneous timestamp in the future).
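
    For example, in nzbget.conf (both values are in seconds):

    Code:
    # how often to scan the incoming nzb-directory
    NzbDirInterval=5
    # only pick up nzb-files that have not changed for this long
    NzbDirFileAge=5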

  11. #86
    Yeah I had the correct nzbget and umask - I did read the "guide" post...

    It was the time though... well spotted... as soon as I changed my NTP server it worked. The default LS time server was dead; I put a proper one in and it's downloading away. I'll test the web interface next!

    Thanks Hugbug, great spot!

  12. #87
    Well, it downloads beautifully and easily maxes my connection, which SAB didn't do... but I need to add a post-processing script to unrar (IMHO this should be in NZBget as standard!!), most likely by just integrating ydrol's script.

    Some questions if I may

    1) To add ydrol's script, which is Perl, I assume I have to install Perl -- which one? Full Perl, or would MicroPerl do? Details at http://ipkg.nslu2-linux.org/feeds/op...table/Packages

    2) Likewise, I assume I have to install unrar (done)?

    3) The reason the web i/f won't work and allow upload of NZBs MIGHT be the lack of any option in $rpc_api='' -- which do I use? I have installed php-fcgi, and I used XML_RPC before when I used Hellanzb, so is that what I choose?


    If I am totally honest, I am VERY impressed with NZBget's speed, but it's not as easy to configure as SABnzbd, nor as friendly. SAB has many more features, but I guess they cost in one way or another? I like the way SAB integrates with the Newzbin website (it can auto-download bookmarked NZBs) and can process NZBs into categories (different file locations for, say, TV shows versus music downloads). I like auto PAR and auto RAR. I also have a great Firefox addin called NZBstatus which monitors SAB's progress.

    But speed matters... really.. and I think NZBget could be developed further (and seems to be)...

    Is there a road map for the future of this great package Hugbug? Please share it with us if you will

  13. #88
    I do not use any postprocessing scripts myself, so sorry, I can't help with configuring them. I hope neilt0 can share his experience on that part.

    The reason I don't use postprocessing and unrar is that the Asus WL-500gP is very slow at accessing a USB-attached hard drive (about 2-3 MBytes/sec). Because of that I can't use it as a shared drive and have to copy downloaded files to the desktop instead. And here is the point: copying files from the WL-500gP to the desktop PC takes as long as unraring them from the WL-500gP to the desktop PC, so unraring on the WL-500gP would just be unnecessary work for the router.

    3) The reason the web i/f won't work and allow upload of NZBs MIGHT be the lack of any option in $rpc_api='' -- which do I use? I have installed php-fcgi, and I used XML_RPC before when I used Hellanzb, so is that what I choose?
    The option affects the communication between the web-interface app and the nzbget server. If you can see the download queue and the log in the web interface, the communication works.
    You may leave the option empty. On the optware platform the json extension (built into php) will be used automatically; this is the best option for speed.

    The option does not affect uploading of files. Uploading does not need any communication with the nzbget server; the web app just puts uploaded files into nzbget's incoming directory. If it does not work, check the permissions for that directory. Also try some other nzb-file; I have had problems with some files (with special characters in their names). As a test, rename any nzb-file to something very simple (test.nzb), put it into "C:\", then try to upload it.
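
    In settings.php that simply means:

    Code:
    // leave empty for auto-detection; on optware the php json extension is used, which is fastest
    $rpc_api = '';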

    BTW, from a local network it is easier to upload files using the shared network directory. I added a shortcut to nzbget's nzb-dir to the Send To menu of Windows Explorer, so any file can now easily be sent to nzbget via Explorer's context menu.

    Is there a road map for the future of this great package Hugbug?
    I'm sorry to disappoint you, but I'm not planning any new features.
    Last edited by hugbug; 22-06-2008 at 18:33. Reason: corrected spell errors

  14. #89
    Perl is easy: ipkg install perl

    Web interface:

    vi /opt/share/www/nzbgetweb/settings.php

    I use:

    $serverip = '192.168.0.16';

    Make sure that's set to your LSPro IP.

    Other than that, dunno why it's not working for you. Hugbug may know. All I can recommend is to carefully check all the settings in the configs.

    The major feature I miss from SAB is nzbdstatus, but nzbget is so fast at uploading NZBs via the web interface that it's not a huge loss.

    There are major advantages to nzbget: speed, RAM usage, libpar2 instead of par2 for single pass par2, scripts can be more sophisticated.
    SAB's fine for a fast PC -- I've used it for years, but for a device like the LSPro, I don't think there's any choice. It has to be nzbget.
    Last edited by neilt0; 22-06-2008 at 19:21.

  15. #90
    Another warmware error... I mistyped the NZB file directory in settings.php.

    Will need to remove PHP from the list of install requirements, as php-fcgi installs PHP anyway, and add Perl (if a post-processing script is used) and optional unrar to the install list.

    Files upload to the NZB dir fine but won't start downloading -- still looking into it.

    Have to confess I am disappointed that no new features are planned at all. That's one thing I have liked about SAB: the devs have added stuff I have requested. Not big stuff, but they keep tweaking and improving.

    Fingers crossed the postprocessing script works now... my net connection gets throttled from 4pm, so it's chugging a bit...

    Are all my files supposed to be called

    (null)
    (null)_duplicate1
    (null)_duplicate2
    (null)_duplicate3 etc?? Seems I have another issue!
