Posts: 692 | Thanked: 264 times | Joined on Dec 2009
#1
Hey all, I'm about to finish a really awesome shell script (not Maemo-related, but still very handy), but one last thing is holding it up. I need to run wget '$url' with the contents of the $url variable actually substituted, but the single quotes prevent that. So the command should end up running like:

Code:
wget 'http://site.com/whatever.html'
It is absolutely vital that I use single quotes here, just like this, or the operation won't work (long story about wget, but trust me there's no other way).

So what escape sequence can I use to make the contents of my variable come out between single quotes? Give me a working answer and I can share my handy script!
__________________
"Impossible is not in the Maemo vocabulary" - Caballero
 
Posts: 1,048 | Thanked: 1,127 times | Joined on Jan 2010 @ Amsterdam
#2
Eh... it's just a man page entry...

There are three quoting mechanisms: the escape character, single quotes, and double quotes. A non-quoted backslash (\) is the escape character. It preserves the literal value of the next character that follows, with the exception of <newline>.

If a \<newline> pair appears, and the backslash is not itself quoted, the \<newline> is treated as a line continuation (that is, it is removed from the input stream and effectively ignored).

Enclosing characters in single quotes preserves the literal value of each character within the quotes. A single quote may not occur between single quotes, even when preceded by a backslash.

Enclosing characters in double quotes preserves the literal value of all characters within the quotes, with the exception of $, `, and \. The characters $ and ` retain their special meaning within double quotes. The backslash retains its special meaning only when followed by one of the following characters: $, `, ", \, or <newline>. A double quote may be quoted within double quotes by preceding it with a backslash. When command history is being used, the double quote may not be used to quote the history expansion character.
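To see those rules in action, here's a minimal sketch (the URL is a made-up placeholder, not the OP's actual one): single quotes suppress expansion entirely, while double quotes expand $url and can carry literal single quotes inside them.

Code:
```shell
# Placeholder URL for illustration only:
url='http://site.com/whatever.html?a=1&b=2'

# Single quotes suppress expansion entirely:
echo '$url'       # prints the four characters: $url

# Double quotes allow $ expansion, and literal single quotes can sit
# inside them, so the value comes out between single quotes:
echo "'$url'"     # prints 'http://site.com/whatever.html?a=1&b=2'
```
Worth noting: in a command like wget 'http://...', the quotes are consumed by the shell before wget ever runs, so wget itself never sees them.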
 
Posts: 472 | Thanked: 442 times | Joined on Sep 2007
#3
result=`wget \'${url}\'`
echo ${result}

__________________
If you don't know how to check your N900's uptime, you probably shouldn't own it.
 
Posts: 692 | Thanked: 264 times | Joined on Dec 2009
#4
Originally Posted by Laughingstok View Post
result=`wget \'${url}\'`
echo ${result}

I'm trying to actually run the command though...
__________________
"Impossible is not in the Maemo vocabulary" - Caballero
 
Posts: 2,802 | Thanked: 4,491 times | Joined on Nov 2007
#5
Originally Posted by GameboyRMH View Post
It is absolutely vital that I use single quotes here, just like this, or the operation won't work (long story about wget, but trust me there's no other way)
That sounds like an interesting story in itself, please elaborate :-)

I assume the URL in question contains "interesting" characters which cause some breakage when used with double quotes on the command line. Have you tried using wget -i instead?
 
Posts: 692 | Thanked: 264 times | Joined on Dec 2009
#6
Originally Posted by lma View Post
That sounds like an interesting story in itself, please elaborate :-)

I assume the URL in question contains "interesting" characters which cause some breakage when used with double quotes on the command line. Have you tried using wget -i instead?
This is basically the problem I'm having:

http://www.planetmike.com/2005/04/12/wget-and-urls-with-ampersands/

I'll give wget -i a try, but putting the URLs in a file would be highly impractical...this script iterates through a list of URLs using cut.
__________________
"Impossible is not in the Maemo vocabulary" - Caballero

Last edited by GameboyRMH; 2010-12-03 at 13:09.
 
Posts: 992 | Thanked: 738 times | Joined on Jun 2010 @ Low Earth Orbit
#7
Originally Posted by GameboyRMH View Post
This is basically the problem I'm having:
...
Huh? That page just displays "No"??

Anyway something like:

Code:
URL="http://www.example.com/some web page.php?a=1&b=2&c=3"
wget "${URL}"
works for me.
 
Posts: 692 | Thanked: 264 times | Joined on Dec 2009
#8
I fixed the URL and I'm now working on the script...
__________________
"Impossible is not in the Maemo vocabulary" - Caballero
 
Posts: 2,802 | Thanked: 4,491 times | Joined on Nov 2007
#9
Originally Posted by GameboyRMH View Post
This is basically the problem I'm having:

http://www.planetmike.com/2005/04/12...th-ampersands/
Ampersands are neutered just fine in double quotes as well:

Code:
$ wget -nv "http://talk.maemo.org/attachment.php?attachmentid=16109&d=1291329691"
13:27:52 URL:http://talk.maemo.org/attachment.php?attachmentid=16109&d=1291329691 [12750/12750] -> "attachment.php?attachmentid=16109&d=1291329691" [1]
I'll give wget -i a try, but putting the URLs in a file would be highly impractical...this script iterates through a list of URLs using cut.
-i can also be used to accept URLs fed to it via stdin, eg:

Code:
$ long-pipeline-using-cut | wget -i -
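For illustration, a sketch of that stdin approach, under the assumption that the URL list is a comma-separated file with the URL in the second field (the file name and field layout are made up, since the OP's script isn't shown):

Code:
```shell
# Hypothetical input file; the real field layout is unknown:
printf 'id1,http://example.com/a.html?x=1&y=2\nid2,http://example.com/b.html\n' > urls.csv

# Extract the URL column -- this is the stage that would feed wget:
cut -d',' -f2 urls.csv

# To actually download, pipe the same output into wget:
#   cut -d',' -f2 urls.csv | wget -i -
```
This way no quoting of the individual URLs is needed at all, since they never pass through the shell's command line.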

Last edited by lma; 2010-12-03 at 13:30.
 
Posts: 692 | Thanked: 264 times | Joined on Dec 2009
#10
WOOHOO Success! I had one other problem but wget "${URL}" did the trick!

Stand by for sweet script!
__________________
"Impossible is not in the Maemo vocabulary" - Caballero
 