I've got a column of image URLs (http://blah.blah......blah.jpg) - can I automatically download them somehow?
Do you want to automatically download them to a folder or to actually be embedded in the XLS?
Which operating system do you use?
Yep, save it as a webpage (if possible), then as long as they're hyperlinks (rather than just text on a page) you can use the Firefox add-on "DownloadThemAll!"
This is assuming the host has no hotlink protection
poly - Member
Do you want to automatically download them to a folder or to actually be embedded in the XLS?
Which operating system do you use?
Downloaded to a folder - it's a weekly thing I'll need to be doing.
I'm getting an Excel file with a list of columns, one of which is the URL (just plain text).
I'll be converting the Excel to a CSV so I can data merge in InDesign. I need the images to be local though, hence needing to download them.
Just supplied as text, not active links. Any way to activate them as links?
joeydeacon - Member
Yep, save it as a webpage (if possible), then as long as they're hyperlinks (rather than just text on a page) you can use the Firefox add-on "DownloadThemAll!"
if you have all the URLs you could craft a set of curl command lines - copy and paste those lines to a terminal
[url= http://osxdaily.com/2014/02/13/download-with-curl/ ]http://osxdaily.com/2014/02/13/download-with-curl/[/url]
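Something along these lines - one line per image, pasted into the terminal (the URLs here are just placeholders for whatever's in your column):
curl -O http://www.example.com/images/photo1.jpg
curl -O http://www.example.com/images/photo2.jpg
The -O flag saves each file under its remote filename in the current directory.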
Yeah, you could copy and paste the lot into Notepad and do a search and replace:
Replace
http://
with
<a href="http://
and replace
.jpg (assuming they're all JPGs)
with
.jpg">Link</a>
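For example (made-up URL), a line that starts out as
http://www.example.com/images/photo1.jpg
ends up as
<a href="http://www.example.com/images/photo1.jpg">Link</a>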
And then run DownloadThemAll and it should let you download all of the images.
Or, depending how many there are, you could just convert the text into image tags, then File > Save the webpage.
So basically put
<img src="
before each one, and
">
after.
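Again with a made-up URL, each line then becomes
<img src="http://www.example.com/images/photo1.jpg">
and when you save the page (as "Web Page, complete") the browser pulls all the images down with it.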
Actually scrap my code, just use this: [url= http://www.multilinkr.com/ ]http://www.multilinkr.com/[/url]
I would use PowerShell, something like [url= http://stackoverflow.com/questions/23398479/download-multiple-files-from-http-using-powershell-with-proper-names ]this[/url]. You just need to save your URLs as a text file and copy and paste the code into PowerShell ISE.
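Roughly speaking it boils down to a couple of lines - something like this (assuming you've saved the URLs to a file called urls.txt, one per line; the filename is just an example):
# Read each URL from the text file and hand it to BITS to download
Import-Module BitsTransfer
Get-Content "urls.txt" | ForEach-Object { Start-BitsTransfer -Source $_ }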
if you have all the URLs you could craft a set of curl command lines
I was going to say, it's a piece of piss with curl or wget. If you're using Windows there's a port of wget - https://eternallybored.org/misc/wget/
Save the column as a text file, then it's just wget -i filename.txt
If you need to do it with native tools I'd be cracking out PowerShell. You could probably suck the list straight out of the spreadsheet with a few lines of code. I might have a crack at it later as a programming exercise.
Not used PowerShell, but in bash something like the following will do what you want. It assumes that all the URIs are in a text file, one per line.
while read -r uri
do
  wget "$uri"
done < mylist.txt
I've not put any options to wget since there are loads and I might not provide the ones you want/need.
Edit: Or even the -i <filename.txt> as noted above.
Here. Automation FTW.
# Path to the spreadsheet and the name of the worksheet containing the URLs
$file = "U:\My Documents\PowerShell\download.xlsx"
$sheetName = "Sheet1"
# Start point of the URL list: row 1, column 6 (i.e. column F)
$rowURL,$colURL = 1,6

Import-Module BitsTransfer

# Open the workbook via the Excel COM object, without showing Excel
$objExcel = New-Object -ComObject Excel.Application
$workbook = $objExcel.Workbooks.Open($file)
$sheet = $workbook.Worksheets.Item($sheetName)
$objExcel.Visible = $false

# Walk down every used row and hand each URL to BITS to download
$rowMax = ($sheet.UsedRange.Rows).Count
for ($i=0; $i -le $rowMax-1; $i++)
{
    $imgURL = $sheet.Cells.Item($rowURL+$i,$colURL).Text
    Start-BitsTransfer -Source $imgURL
}

$objExcel.Quit()
Copy this into Notepad. Change $file to be the absolute pathname to your spreadsheet, and $sheetName if your worksheet name is anything other than the default.
$rowURL and $colURL are the start point of the list. In my example you've got 1,6 which is the first row in column F. If your data was in column D and you had column headings, this would need changing to be 2,4.
Now save it with a .ps1 extension. (You need to change the "Save as type" dropdown to "all files" first, or you'll end up with download.ps1.txt.)
Double-click it, and it'll suck the data straight out of Excel and download all your files, no copying and pasting required. I haven't specified a destination, so it will download to wherever you launch it from.
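(If you'd rather point them at a specific folder, Start-BitsTransfer also takes a -Destination parameter - e.g. Start-BitsTransfer -Source $imgURL -Destination "C:\images", with the path being whatever you want.)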
EDIT: you'll probably need to configure PowerShell to run unsigned code first. Enter the following into a PS prompt:
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
Cheers all, bookmarked this thread to try some of this out. Got busy today, so didn't get the chance to come back to it.
I'll no doubt bump it in a few days/next week with either success or failure.
PS: work computer still on W7.
That script ^^ works in W7 cos that's what I wrote it on. There's any number of other download commands, which didn't work on PowerShell 1 or whatever W7 comes bundled with.