Links on captures

  • 3
  • Question
  • Updated 3 months ago
I capture a lot of material for school that contains embedded links to reference material. Is there an option to have those links work in captures?

twillrick

  • 1 Post
  • 0 Reply Likes

Posted 1 year ago


jcthewizard

  • 24 Posts
  • 7 Reply Likes

Snagit did provide the ability to capture URLs from links on captured documents, up until version 12. In the past I was an 'expert witness', and I used that feature often when doing legal research and preparation, as the links provided detailed proof of what the web site shared with others, and it could be proven if the links changed to another source, or perhaps to another web site that was not allowed to be connected to the original web site's company.

I asked TechSmith if they would restore that feature, and they claim their customers had no need for it, so they chose not to support it any longer. Instead they encouraged me to go back to version 12, which does not capture scrolling web pages as well as versions 18 and 19 do. Such a shame.

Jay.Cummings

  • 7 Posts
  • 1 Reply Like
We definitely need this feature!!!

Paul

  • 1644 Posts
  • 1245 Reply Likes
Links are separate items, applied as an attribute to an image in the source and target locations. They are not an inherent part of the image file itself.
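
To illustrate the point, here is a minimal sketch of how a link relates to an image on a web page: the URL lives in the markup that wraps the image, not in the image file. The file names and URL below are hypothetical examples.

```python
# A hyperlink is an attribute of the markup around an image, not part of
# the image file itself. This tiny script wraps a capture in a one-link
# HTML page so the whole picture becomes clickable.
# "capture.png" and the URL are hypothetical examples.
html = '<a href="https://example.com/source-page"><img src="capture.png" alt="captured page"></a>'
with open("capture.html", "w", encoding="utf-8") as f:
    f.write(html)
```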

If your source is a web page, you could expose the URL by turning on the Capture Information effect, copying the URL from the browser address bar, and applying it as custom text. The URL is not linked, however, and is therefore not clickable.

If the source is not a web page, you could use the same technique to cite the source.

Or perhaps I have misunderstood your question?

Jay.Cummings

  • 7 Posts
  • 1 Reply Like
Clickable links for URLs and embedded links would be extremely valuable!

jcthewizard

  • 24 Posts
  • 7 Reply Likes
I think my purpose may have been misunderstood. For forensic purposes, to document what was included on a web page at a specific time, capturing the embedded URL and including it with each link on a produced PDF of the web page is perfect. As I mentioned, versions up to 12 did this, and of course Adobe Acrobat provides this. The feature is greatly missed, as Snagit can no longer be used in this capacity, leaving hundreds, if not thousands, of expert witnesses without an easy way to document what was on a web page.

Rick Stone

  • 6629 Posts
  • 3187 Reply Likes
If memory serves, the ability to have links from an image was also tied to the image being saved in the SWF format. And Steve Jobs effectively killed that format in favor of HTML5.

Jay.Cummings

  • 7 Posts
  • 1 Reply Like
It would surely be great if there were an option for Snagit to overcome any technical format limitations.

Rick Stone

  • 6629 Posts
  • 3187 Reply Likes
I suppose that technically they could. But what it would require is for SnagIt to save out some HTML and JavaScript in addition to the image.

When things were in SWF format it was simple. You just spat out a SWF, then you could upload the SWF to your web site and you were done.

But making it happen with HTML and JavaScript complicates things considerably. In order to do that, the scripting and HTML code has to be maintained so it works in all browsers. Then you have the compounded issue of users being befuddled because they no longer have a single file to upload and use. 
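
As a rough illustration of what such an export might involve, here is a minimal sketch that wraps a capture in an HTML image map with clickable hotspots. The file names, hotspot coordinates, and URLs are all hypothetical, and a basic image map like this actually needs no JavaScript at all:

```python
# Minimal sketch of the kind of export described above: an image plus an
# HTML file that overlays clickable hotspots on it via an image map.
# File names, hotspot coordinates, and URLs are hypothetical examples.

hotspots = [
    # (left, top, right, bottom, target URL)
    (10, 40, 300, 60, "https://example.com/first-link"),
    (10, 90, 300, 110, "https://example.com/second-link"),
]

areas = "\n".join(
    f'  <area shape="rect" coords="{l},{t},{r},{b}" href="{url}">'
    for (l, t, r, b, url) in hotspots
)

html = f"""<!DOCTYPE html>
<html><body>
<img src="capture.png" usemap="#links" alt="captured page">
<map name="links">
{areas}
</map>
</body></html>"""

with open("capture.html", "w", encoding="utf-8") as f:
    f.write(html)
```

Even this bare-bones version shows the packaging problem: the capture is no longer a single file, and the hotspot coordinates would have to be recorded at capture time.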

Based on that, I can see where TechSmith would maybe be a bit on the reluctant side to enter that arena because it opens a bit of a can of worms.

Of course only TechSmith can say for sure whether it's something that would be on their radar. 

Cheers... Rick :)

Paul

  • 1644 Posts
  • 1245 Reply Likes
Not to mention that any image that came with an HTML and JavaScript payload would be treated as a virus by most computer systems :)

Realistically, it's never going to happen - the technical challenges are disproportionate to the benefit.

jcthewizard

  • 24 Posts
  • 7 Reply Likes
Rick Stone: In the past (for me, versions 9 through 12), obvious links (text and embedded URLs, as well as URLs associated with graphics) were maintained in a 'save as' PDF. That technology exists. Flash diminishing in use and some URLs being treated as a virus (as Paul wrote) are two different issues; URLs flagged as a virus are usually controlled with your virus protection. If my job were to document what URLs a web page points to, I would set that tolerance high so as to allow the capture. If some objects (i.e., graphics) use unreadable URLs, so be it. I would MUCH rather be able to capture as many URLs as possible than none at all.

Rick Stone

  • 6626 Posts
  • 3186 Reply Likes
Hi there

Apologies, but I'm not sure I fully understand this statement:

In the past (for me, versions 9 through 12), obvious links (text and embedded URLs, as well as URLs associated with graphics) were maintained in a 'save as' PDF.

Care to expound on that?

Are you perhaps saying that you used the older feature to add hotspots to the image and, instead of saving as SWF, saved to PDF? Assuming so, I believe the SWF one would normally get was likely simply embedded inside the PDF. And ultimately, I suppose it really doesn't matter exactly how it worked behind the scenes, only that something was produced that allowed you to click and visit one or more URLs?

Additionally, what seems confusing to me is the mention of capturing URLs. When I think of Snagit, I tend to think of capturing images. I do understand that URLs can present pages that present images, but I'm having a disconnect in pondering how URLs would relate to a captured image. Unless, of course, it's something like we see on Facebook, where an image is actually linked to a URL that presents the image and an associated news article.

Cheers... Rick :)

Paul

  • 1644 Posts
  • 1244 Reply Likes
Rick, I believe the OP needs to be able to cite the source of a capture and have that as a clickable link.

But every web app that I know of requires you to specify the image and URL separately. Facebook simply resolves the URL you give to the linked image on the site.

Jeff V. Pulver

  • 41 Posts
  • 11 Reply Likes
I also miss that feature. It was very useful. I have a workaround, but it is cumbersome: I click on each link and create a shortcut with that link as the target. I give each shortcut the same name as the MHT capture, appending a sequential number starting with 01 for each link. Those shortcuts are saved in the same folder as the captured MHT web page.
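
For anyone wanting to automate that workaround, here is a rough sketch. The capture file name is hypothetical, and note that parsing an MHT file as raw text is only an approximation (on a plain .html save it works directly):

```python
# Rough sketch of automating the workaround above: read a saved capture,
# collect every link, and write a numbered Windows .url shortcut for each
# one next to the capture. "capture.mht" is a hypothetical file name.
from html.parser import HTMLParser
from pathlib import Path

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("http"):
                self.links.append(href)

source = Path("capture.mht")
collector = LinkCollector()
collector.feed(source.read_text(encoding="utf-8", errors="ignore"))

# One .url shortcut per link: capture01.url, capture02.url, ...
for i, url in enumerate(collector.links, start=1):
    shortcut = source.with_name(f"{source.stem}{i:02d}.url")
    shortcut.write_text(f"[InternetShortcut]\nURL={url}\n", encoding="utf-8")
```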

s.nicholson

  • 2 Posts
  • 0 Reply Likes
I also miss this feature.

I used it in the past when capturing all or part of a webpage in a browser and then saving that to a PDF. It meant that I could take a snapshot of a page and keep all of its links intact.

Now I have to resort to printing to a PDF and then marking up the PDF in another application, which is not nearly as user-friendly and does not produce nearly the same result.

I would REALLY appreciate, and use, this feature if it were available. Please. :o)

Jay.Cummings

  • 7 Posts
  • 1 Reply Like
Case in point: just yesterday I needed to share an informational capture that included several URLs. Since I couldn't get the URLs to capture and remain live, I had to retype everything. Very frustrating. I couldn't deliver the beauty of the screenshot by retyping it, but I also had to deliver the links. The bottom line for me was several hours of work and a less-than-pretty delivery to my customer. I simply have to find a different solution that provides this capability.

Rick Stone

  • 6629 Posts
  • 3187 Reply Likes
Can you expound on that? I'm getting a disconnect when you say: "I couldn't deliver the beauty of the screenshot by retyping it, but I also had to deliver the links."

What did you ultimately end up doing to accomplish your goal? Use SnagIt with something else? If so, what? How did you manage to deliver something with working links?

I'm asking because the answer might help others seeking a similar solution.

Jay.Cummings

  • 7 Posts
  • 1 Reply Like
100% manual effort. I tried pasting onto the snagged image, but just couldn't match the fonts and other items effectively. I was very disappointed, and I tried several approaches to accomplishing my goal, so it took a very long time.

Rick Stone

  • 6629 Posts
  • 3187 Reply Likes
I just did some poking around using the great and mighty Google machine. It looks like there is a browser extension called "Fireshot" that allows capturing web pages and saving them as a PDF with hyperlinks that function. Perhaps give that a try?

I just tried it using Google Chrome and it seems to work.

Jay.Cummings

  • 7 Posts
  • 1 Reply Like
Cool, I'll search for it.

Mike62

  • 598 Posts
  • 167 Reply Likes
I believe there is another thread covering more or less the same matter:
https://feedback.techsmith.com/techsmith/topics/capture-url-links-in-captures-from-browsers-and-docu...

I think Paul came up with the idea of adding the URL to "Capture Info". It could be that a full URL is too complicated; for instance, it may be (way) too long.
Maybe only the domain of the site would be sufficient(?) (see the sketch below).
However, what Snagit does capture is the 'Window Title': right-click on the thumbnail in 'Recent' at the bottom.
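
A quick sketch of the domain-only idea; the URL below is just a hypothetical example:

```python
# Sketch of the suggestion above: keep only the domain when the full
# URL is too long for Capture Info. The example URL is hypothetical.
from urllib.parse import urlparse

url = "https://feedback.techsmith.com/techsmith/topics/links-on-captures"
print(urlparse(url).netloc)  # feedback.techsmith.com
```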

BTW...

Snagit Editor -> Library (left window, at the bottom)

For me it shows quite a number of websites, but the number of screenshots listed for these sites is way, way incomplete.
For instance, it shows 4-5 screenshots taken from the TechSmith domain, but I've captured a lot more than that in the past.

It puzzles me why it is so incomplete... maybe someone can comment on that??

That aside, doesn't this 'Web sites' view show that Snagit is indeed able to keep track of at least the domain URL?
The domain URL, together with the 'Window Title' (details), would provide a reasonable hint as to where a screenshot was taken from.

s.nicholson

  • 2 Posts
  • 0 Reply Likes
@Rick - thanks for the heads-up on Fireshot.

Initial tests show that it seems like a viable option for capturing a webpage and retaining the links within the page. 

Unfortunately, the resultant PDF has compatibility issues if you need to edit it at all. It also looks like you need the paid version to avoid extra mouse clicks due to ads; it's unclear whether the results from the paid version are more compatible for the purposes of editing.

Maybe one day this feature will return to Snagit!