Hello, everyone. Long time listener, first time caller.
Here's what I'm setting out to do: because I can no longer reliably receive delivery of the New York Times for my daily crossword obsession, I'm now a subscriber to their digital crossword. I still prefer to do the puzzle on paper, and fortunately the NYT offers a scaled PDF of the puzzle each day. I'm hoping to write a script (I'm incredibly new to this) to download the file and print it every day, so that I can feel like I live in a future where magic robots deliver my crossword puzzle to my printer each morning.
The file naming scheme is simple and consistent. As an example, here's today's PDF location:
Obviously, the only thing changing here daily is the file name. I'm hoping I can use today's date to scrape the relevant information and generate the correct filename on a daily basis.
From there I assume it should be trivial to download the file to a specific location and automatically send it to the printer. I've been poking around, and I have some level of confidence that I can pull off the downloading and printing parts. I don't, however, have any idea how to get started generating the daily file name.
Any assistance you can provide would be hugely helpful. I know the basic answer is "learn to code, it's really helpful." I'm willing to put some time and effort in here, but I figured this would be a great place to start.
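For what it's worth, the date-to-filename step only takes a few lines of shell. Everything below is an assumption: the base URL is a placeholder, and the `date` format string (here a MonDDYY scheme, e.g. `Oct2124.pdf`) would need to be adjusted to match the real naming scheme.

```shell
#!/bin/sh
# Sketch of the daily download-and-print idea. BASE_URL is a
# placeholder -- substitute the real location of the PDF.
BASE_URL="https://example.com/crosswords"   # hypothetical

# Build today's filename from the date. %b%d%y gives e.g. "Oct2124";
# change the format string to whatever the real scheme turns out to be.
FILENAME="$(date +%b%d%y).pdf"
echo "$FILENAME"

# Pass --run to actually download and print; without it, this is a
# dry run that just shows the generated filename.
if [ "$1" = "--run" ]; then
  curl -fsSL -o "$HOME/Downloads/$FILENAME" "$BASE_URL/$FILENAME"
  lpr "$HOME/Downloads/$FILENAME"
fi
```

Running it every morning could then be handled by launchd or a repeating Calendar alarm that opens the script.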
I have run into one problem. I want to use Automator to download files from specific URLs hosted on the mega.cz server (similar to Dropbox).
So far I've configured Automator the way I need, but the file download is a dead end for me. The files are stored on the Megaupload server (www.mega.cz), and I don't know which Automator action will start downloading the file from that server.
In the picture you can see where I am stuck.
The URL https://mega.cz/#...... points to a .rar file.
I guess Automator can't work out what to download, because if I open that link in a browser it shows options like these (Picture 2).
I have tried every Automator action that seemed logical after the script that filters the URLs, but so far no success.
Maybe someone has an idea of how to download these files.
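One possible explanation for the dead end: MEGA-style links put the file's decryption key after the `#` in the URL, and that fragment is never sent to the server, so a generic URL-download action has nothing useful to fetch. A command-line client that understands such links, for example `megadl` from the megatools package, could be called from a Run Shell Script action instead. This is only a sketch under those assumptions (megatools installed, URLs passed in as arguments; the `--path` flag and the mega.cz pattern are assumptions mirroring the question):

```shell
#!/bin/sh
# Sketch for a "Run Shell Script" Automator action ("Pass input: as
# arguments"). Assumes megatools is installed.
for url in "$@"; do
  case "$url" in
    https://mega.*/#*) ;;                       # looks like a MEGA link
    *) echo "skipping non-MEGA URL: $url"; continue ;;
  esac
  if command -v megadl >/dev/null 2>&1; then
    megadl --path "$HOME/Downloads" "$url"      # fetches and decrypts
  else
    echo "megadl not installed; would download $url"
  fi
done
```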
Hi all, I was originally trying this with Automator, but if someone can guide me to accomplishing this more easily with AppleScript, feel free, though I haven't worked with it before. I am trying to loop specific actions within a workflow. I just need the workflow to process a single image file continuously, so the selected edits repeat within that one file until I stop the loop. Here are the basic actions I'm trying to get started with:
1 - Get Specified Finder Items
2 - Copy Finder Items
3 - Scale Images
4 - Crop Images
5 - Loop
The problem is that I want to loop only actions 3 & 4, but the loop starts over at 1 and creates a new file each time it runs. Can I create two workflows, where the second (containing actions 3/4/5) starts when the first (containing 1/2) ends, so only the second workflow is looped? I'm not sure how to tell the second one to start when the first ends, though. Do I just set the input of the second to use the first workflow file?
Or, if it's possible to run this as a service from within Preview, that's fine too. Originally I had this set up as Watch Me Do actions, but Automator won't record the selection tool; if I can get the selection tool recorded, then I can keep it as a Watch Me Do workflow. Acorn (for crop/scale, I believe) and Extra Suites (an AppleScript app for recording the mouse click & draw) were apps mentioned in other posts that came up in Google searches which might help accomplish this workflow or script.
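Outside Automator, the "copy once, then loop only the scale and crop steps" shape can also be expressed with `sips`, the image tool that ships with macOS, called from a shell script. The file names, sizes, and pass count below are all made-up examples:

```shell
#!/bin/sh
# Steps 1-2 (get + copy) run once; steps 3-4 (scale + crop) repeat
# on that single working copy. Paths and dimensions are hypothetical.
SRC="$HOME/Pictures/original.png"   # hypothetical input file
WORK="$HOME/Pictures/work.png"

if [ -f "$SRC" ]; then
  cp "$SRC" "$WORK"                 # done once, outside the loop
  i=0
  while [ $i -lt 5 ]; do            # 5 passes here; loop as needed
    sips -Z 800 "$WORK" >/dev/null      # scale: longest side to 800 px
    sips -c 600 600 "$WORK" >/dev/null  # crop to 600x600 from center
    i=$((i + 1))
  done
fi
```

Because only the edits sit inside the `while`, each pass reworks the same file instead of producing a new copy, which is the behavior the Automator Loop action wasn't giving.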
Hi, there's a site out there called Big Cartel that lets you set up online stores. As orders come in, they are organized on an orders page. You can then click each individual order, and a new page pops up with the order details. From there you can print the order. I'd like to create a script or Automator app (or workflow) that will extract the individual URL of each order on the page and then save it as a web archive or PDF so I can print them as a batch of orders. Do you think this is possible? The whole process doesn't have to be scripted, but maybe at least to the point of extracting each URL and downloading the link; I can do the batch printing myself. The code in the HTML for the list of orders is as follows (of course with sample names):
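Since the HTML itself isn't shown here, the following is only a hedged sketch: it assumes the order links look like `<a href=".../orders/123">`, and the orders-page URL is a placeholder. The `grep`/`sed` pattern would need adjusting to the real markup, and the page may require a logged-in session (e.g. passing cookies to `curl`).

```shell
#!/bin/sh
# Hypothetical sketch: pull per-order URLs out of the orders page and
# save each order locally for batch printing. URL and href pattern
# are assumptions, not Big Cartel's actual markup.
ORDERS_PAGE="https://example.bigcartel.com/admin/orders"   # placeholder

curl -fsSL "$ORDERS_PAGE" \
  | grep -Eo 'href="[^"]*/orders/[0-9]+"' \
  | sed -E 's/^href="(.*)"$/\1/' \
  | sort -u \
  | while read -r url; do
      name="order-$(basename "$url").html"      # e.g. order-123.html
      curl -fsSL -o "$HOME/Downloads/$name" "$url"
    done
```

Saved that way, the files can be opened and printed in one go, which matches the "I can do the batch printing myself" part.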