recommendations for transcoding video camera footage?

  • Question
  • Updated 2 months ago
What is the recommended format to transcode footage into in order to edit videos smoothly in Camtasia? I'm getting choppy playback in the preview when trying to edit my video files. Screenshots of my media are attached. Thanks!

john

  • 117 Posts
  • 5 Reply Likes

Posted 6 months ago


Ed Covney

  • 539 Posts
  • 317 Reply Likes
What kind of computer and video card do you have? They both affect the outcome.
I noticed the 560V produced 60 fps video and the other two sources 30 fps, so I think the best approach would be to render at 30 fps - the lowest common denominator.

Joe Morgan

  • 7599 Posts
  • 4152 Reply Likes
I would convert the footage at its original frame rate.

Why is that? Your 60 FPS Sony footage is probably going to look best at 60 FPS, especially if it contains fast motion. And if you want to apply clip speed to slow it down for a slow-motion effect, then you absolutely want 60 FPS available to you.

If you’re just going to be intermixing this 60 FPS footage with the Canon 30 FPS and the other Sony 30 FPS footage, and you’re not going to be doing much else with it other than viewing it in real time, then 60 FPS might not be critical to the production. You could actually create a 30 FPS project; when the 60 FPS footage is added to the timeline, half of the frames will be discarded automatically.

However, that should have nothing to do with how you choose to convert the videos. I would convert all my videos at their native FPS: convert your 60 FPS videos to 60 FPS (well, 59.94 FPS to be exact), and likewise convert the other footage to 29.97.

Your project settings will dictate the project's FPS. If you set Camtasia’s project settings to 60 FPS, you can do everything you wish with your 60 FPS footage, and your 29.97 FPS footage will become 60 FPS when placed on the timeline. Camtasia simply doubles every frame, which doesn’t harm a thing: it doesn’t create jitter, and you cannot see it when viewing the final video.

Same story with projects created at 30 FPS: 60 FPS footage will be reduced to 30 FPS when placed on the timeline. As I mentioned earlier, it simply drops half the frames evenly, and the video plays smoothly.
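The conforming described above can be sketched in a few lines. This is a simplified illustration, not Camtasia's actual code; the `conform` function and its nearest-frame arithmetic are my own:

```python
def conform(frame_count: int, src_fps: int, project_fps: int) -> list[int]:
    """Map source frame indices onto a project timeline, the way an
    editor conforms a clip whose frame rate differs from the project's."""
    # Number of output frames the clip occupies at the project rate.
    n_out = frame_count * project_fps // src_fps
    # For each output frame, pick the source frame nearest in time.
    return [i * src_fps // project_fps for i in range(n_out)]

# 30 fps clip in a 60 fps project: every source frame appears twice.
print(conform(4, 30, 60))   # [0, 0, 1, 1, 2, 2, 3, 3]
# 60 fps clip in a 30 fps project: every other source frame is dropped.
print(conform(8, 60, 30))   # [0, 2, 4, 6]
```

Doubled frames are invisible on playback, and dropping every other frame of 60 fps footage keeps motion even, which is why both directions play smoothly.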

Your converted footage remains versatile and ready for any project down the road, or ready if you change your mind tomorrow morning and change the frame rate of your project.

If your project is going to be 30 FPS, there is a performance gain if the 60 FPS footage is already converted to 30 FPS, so there is that to consider as well.

Just convert it to an MP4.
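As a sketch of what that conversion might look like (not an official recommendation - this assumes ffmpeg is installed, and `input.MTS`/`output.mp4` are placeholder names):

```shell
# Re-encode to an H.264 MP4; ffmpeg keeps the source's native frame
# rate when no -r option is given.
ffmpeg -i input.MTS -c:v libx264 -preset fast -crf 18 -c:a aac output.mp4

# Edit-friendly variant: -g 1 makes every frame a keyframe, which
# scrubs and seeks more smoothly at the cost of a larger file.
ffmpeg -i input.MTS -c:v libx264 -crf 18 -g 1 -c:a aac output.mp4
```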

Regards, Joe


john

  • 117 Posts
  • 5 Reply Likes
Thanks guys. It's not the FPS I'm concerned with. Editing ANY footage shot with any of those cams is pretty sluggish in Camtasia, and it always has been for me in my 10+ years of using it. Granted, I built my PC years ago and it is aging, but compared to Pinnacle Studio, Sony Vegas, and Premiere Pro, Camtasia is by far the weakest. However, editing screen captures has always been smooth (unless I'm adding a ton of elements to a 10-minute clip).

So, I was just wondering if anyone had a reference codec and settings they use for transcoding to a more digestible format, like mpeg2, mp4 with a certain bitrate, etc.

PC specs:

  • Win 10 Edu
  • Gigabyte GA-990FXA-UD3
  • AMD FX-8370 @ 4 GHz
  • G.SKILL 32 GB 1610 MHz
  • GTX 1080 FTW
  • Samsung 1 TB SSD
  • Monoprice 27" 2560x1440


Jack Fruh, Champion

  • 575 Posts
  • 197 Reply Likes
This is an old post, but it was highlighted in another post, so I thought I'd add to it.

That GTX 1080 will enable programs designed for it to take advantage of the CUDA cores for decoding/encoding. The current version of Camtasia has a "Hardware Acceleration" setting on the Advanced tab; you might check whether it says "Use graphics card - NVIDIA GeForce GTX 1080".

All 3 of your videos are h.264, which should be fine. The low profile is supposed to be less demanding to decode, so you could try that, but I suspect that's not the bottleneck.

You didn't mention in your post which version of Camtasia you have, and I don't remember offhand which release added GPU acceleration. To put some closure on this, it'd be good to know your Camtasia version and whether you have hardware acceleration enabled. The rest of your system looks mostly OK (the FX-series processors are a bit long in the tooth, but this work should mostly be handled by the video card, and the 1080 should be more than enough). Also, I presume you have 64-bit Windows to take advantage of the 32 GB of RAM.

David Bookbinder

  • 43 Posts
  • 27 Reply Likes
I've had good luck getting Camtasia to ingest my files and let me work on them by using HandBrake and exporting to MP4 using one of the Production presets.

I tinkered with the Production Standard preset to add some sharpening and optimize for film, and Camtasia has no problems rendering.
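For anyone who wants to script that step, HandBrake's GUI presets are also available from the command line. A sketch (the exact preset name should be checked against `HandBrakeCLI --preset-list`, and the file names are placeholders):

```shell
# Transcode with the same Production Standard preset used in the GUI.
HandBrakeCLI --preset "Production Standard" -i input.mov -o output.mp4
```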

I also had to NOT use my graphics card and instead use the built-in graphics of my Intel processor, as apparently TechSmith has not tested this software with my mainstream AMD Radeon graphics processor. However, even using the Intel GPU, it's reasonably snappy as long as I don't layer too many animations and behaviors and images on top of each other.

Ed Covney

  • 538 Posts
  • 317 Reply Likes
David - I just went through 11 weeks of heck with an ASUS motherboard that wouldn't disable the onboard graphics (Intel 630). In the end, ASUS traded me for a motherboard/BIOS that worked. So when I see a Device Manager listing both an on-board graphics adapter and a discrete add-in card, I assure you, your troubles have just begun.

In my case, the on-board graphics I wasn't using stole 16 PCIe lanes that my PCIe RAID card then couldn't use (at least alongside my GTX 1070). The RAID card held 4 M.2 chips but only recognized one, in an x4 slot.

So if you're able, try disabling the motherboard's onboard graphics in the BIOS with your discrete card still in place. When you boot, if Device Manager still lists both, the BIOS and/or motherboard is broken. If the onboard GPU is properly disabled, both the CPU and your discrete card can use the full 16 lanes. Three devices cannot get 16 lanes each, so the BIOS made the decision for you: #1 the CPU, #2 the onboard GPU, and #3 only 4 lanes (garbage) for your add-on PCIe x16 slot. That's why the on-board GPU worked better for you, even though it might be 5-20 times slower than the card you added.

If any TS folks are following:
Windows knows when there are two video cards contending for 16 PCIe lanes each. So why not automatically switch to the on-board card whenever Windows sees both? The more you can automate for users, the better, right?

David Bookbinder

  • 43 Posts
  • 27 Reply Likes
Ed,

Thanks for the idea and the explanation of why the AMD card may be slower than the onboard graphics. Unfortunately, however, I just checked my BIOS settings, and there's no way to turn off the onboard graphics processor or to change any settings related to it. I'll keep this in mind if I get or build a new machine for video editing, which is likely if I keep creating courses.

David

Ed Covney

  • 538 Posts
  • 317 Reply Likes
David - Just to be clear, your BIOS should by default turn off the on-board graphics chip the moment you insert your discrete card. Contact your motherboard manufacturer and see if they can upgrade the BIOS. This has been an IBM (PC) standard since 1981: only one VGA. In my recent experience with ASUS, I kept up the pressure until they either fixed the BIOS or sent me a new board that worked right (and they did).

Although it's no longer a problem for me, it took me 10 weeks to reach an engineer who acknowledged it was their problem and their duty to make it right, and a week to get my new board. At 71, I had the time, the motivation, and am stubborn enough to not let it go.

David Bookbinder

  • 43 Posts
  • 27 Reply Likes
This is a laptop, so the AMD graphics card came with the machine. My understanding, which may well be wrong, is that they keep both GPUs active because the on-board chip is less power hungry than the external GPU, and that Windows and apps can switch between them. Is that not correct? It would be nice to use the AMD chip when the machine is plugged in, but I can't see any way to set that up. Both are set to "maximize performance" in Windows power management. 

Joe Morgan

  • 7598 Posts
  • 4152 Reply Likes
I don't use a laptop, so I was unaware of this technology. NVIDIA Optimus does exactly what you describe.

Pretty cool, and it makes good sense. Here's a description:

NVIDIA Optimus technology (not available on 3D panel)

NVIDIA® Optimus™ technology automatically optimizes your battery life while maintaining the graphics performance you expect — completely, seamlessly and transparently — whether you’re watching a movie, surfing the Web or playing a game.

How does it work?

  • This intelligent graphics technology switches between discrete and integrated graphics processors automatically whenever it determines what kind of application is being used. If you are simply surfing the Web, the GPU switches to the integrated version, therefore helping to extend your battery life. It's that easy to experience long battery life and amazing visuals without having to manually change settings.

  • Watch an HD movie, surf the Web or play games knowing you can get the long battery life you need and the performance you expect from NVIDIA Optimus technology.


Ed Covney

  • 538 Posts
  • 317 Reply Likes
David - How is the 2nd GPU attached? Does the BIOS see it? In device manager, where does it display the two GPUs? Thanks.


David Bookbinder

  • 43 Posts
  • 27 Reply Likes
The two GPUs are displayed under Display adapters, first the Intel GPU, then the AMD Radeon. The BIOS interface doesn't have any GPU-related settings, so I have no information from the BIOS point of view.

Ed Covney

  • 538 Posts
  • 317 Reply Likes
David - How is the 2nd GPU attached? 

David Bookbinder

  • 43 Posts
  • 27 Reply Likes
As I said, it's a laptop. I assume there's a miniature Radeon GPU card in there, but I haven't taken the computer apart to find out.

What are you trying to determine? As far as I can tell, it works the way it does by design, to optimize power usage when on battery.

I do almost all my video editing plugged in, so it would be nice to be able to bypass the Intel GPU when the computer is plugged in and see if Camtasia uses the Radeon card more effectively, but I suspect that's not possible with this machine. Also, I'm not sure it would work. I had to update the Radeon driver with a more recent version than Dell recommends just to get Camtasia to run with it at all. I think the problem is mainly on the Camtasia end, not the motherboard, though it would be interesting to find out.

David

Ed Covney

  • 538 Posts
  • 317 Reply Likes
"What are you trying to determine?"
Whether both video cards are on the PCIe bus. That's probably true, and it tells me it's almost certainly a hardware problem if you can't turn off one of the cards.
What model Dell? I can't fathom that they'd install a great AMD GPU and leave an albatross wasting space and resources.

David Bookbinder

  • 43 Posts
  • 27 Reply Likes
It's an Inspiron 5770. It's Camtasia that has issues with the Radeon GPU, and I think the problem is on Camtasia's end.

Joe Morgan

  • 7599 Posts
  • 4152 Reply Likes
I may have found a solution, David Bookbinder.

This article covers doing what you're attempting through the "AMD Catalyst Control Center". The instructions cover a Dell computer, but the Catalyst Control Center is what does it, so this should apply to all computers using it. I had an AMD card in a desktop years ago, so I used it myself.

https://www.dell.com/community/Laptops-General-Read-Only/How-to-switch-from-integrated-intel-video-c...

This switchable graphics tab kind of says it all.



However, the Control Center has been updated and changed its name; I would think you can access the same settings. You'll have to sort this out: https://www.amd.com/en/technologies/radeon-software

Hopefully, this will work. Joe

David Bookbinder

  • 43 Posts
  • 27 Reply Likes
Thanks for looking. I saw something similar in a YouTube video. However, from what I can see, AMD has completely redesigned the interface since then.

I don't see any functionality in the new Adrenalin interface to switch processors depending on whether the computer is running on battery or plugged in. In any case, the problem I'm having is not that I can't tell Camtasia to use the Radeon card; it's that it performs poorly when I do, so I have to tell it to use the built-in GPU.