@tommo or @Troy - I’ve seen a clip here or there where it looks like you’re watching the down-the-strings view that’s shot from a phone on a nearby laptop screen, in real time. Is that something we could easily duplicate at home? How are you doing it?
Doesn’t sound familiar. Short answer: this is not useful, even if you could do it. You need to learn these techniques by feel, not by watching. The watching can be done afterward as a spot check, but if you watch while trying to play, you’ll “learn” the mirror or video feedback and sometimes can’t replicate the technique without the video or mirror. So I wouldn’t worry about doing this.
Yeah sometimes for me it’s been like: “I can play this well in front of a mirror if light is coming from my right but not if light is coming from my left”! Weird brain stuff.
Are we talking about the magnet here?
Not just the magnet, but when @Troy does interviews, there is software he uses to let him see the magnet view on a computer screen in real-time (or close to it), similar to watching through a webcam. I think it’s mainly for making sure the framing is good when he’s “on the clock” with an interviewee.
That’s not actually the case — we just record directly on the phone and have no way of seeing what it “sees” until we dump the footage. I think you’re thinking of the original rig which was a small computer-connected camera. That camera had no onboard recording and could only record via computer, albeit at data rates that were too high to record continuously on a laptop. Which is why all those clips are like 10 seconds long. If I had to do it over again, I would have built some giant disk that could actually do the complete interview non-stop.
Sorry for getting that wrong.
But did you ever stream from the magnet view during one of the live “Talking the Code” segments a few years back, or is the July heat just scrambling my brain today?
We might have tried something like that. We’ve looked into this at various points over the years. More recently, we’ve tooled around with the ability of iOS to transmit its screen to a host computer through a Lightning cable. You need an app on a desktop machine that can receive the signal. It works, but there’s a delay, so it’s disorienting. And of course you can only go as far as the cable. There are other apps that will do it over wifi, but the delay is even larger, the resolution is lower, and the network connection is spotty. None of these work with 120fps video enabled.
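For anyone who wants to tinker with the wired route at home: on the Mac side it’s essentially what QuickTime’s movie recording does when you pick the phone as the camera. Once a CoreMediaIO flag is set, a plugged-in iPhone shows up as an ordinary capture device. Here’s a rough Swift sketch of that mechanism, not our exact setup; the function names are just for illustration and the device-type constant names shift a bit between macOS versions:

```swift
import AVFoundation
import CoreMediaIO

// Opt in to "screen capture" devices. Once this CoreMediaIO property is set,
// an iPhone plugged in over USB shows up as a normal capture device
// (the same mechanism QuickTime uses for iOS screen recording).
func allowScreenCaptureDevices() {
    var address = CMIOObjectPropertyAddress(
        mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
        mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
        mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMain))
    var allow: UInt32 = 1
    CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                              &address, 0, nil,
                              UInt32(MemoryLayout<UInt32>.size), &allow)
}

// Find the phone and start a session. The device appears asynchronously,
// so a real app would also listen for AVCaptureDevice.wasConnectedNotification.
func makePhonePreviewSession() -> AVCaptureSession? {
    allowScreenCaptureDevices()
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.externalUnknown],   // pre-macOS-14 name; newer SDKs call this .external
        mediaType: .muxed,                 // iOS screen devices deliver muxed audio + video
        position: .unspecified)
    guard let phone = discovery.devices.first,
          let input = try? AVCaptureDeviceInput(device: phone) else { return nil }

    let session = AVCaptureSession()
    if session.canAddInput(input) { session.addInput(input) }
    session.startRunning()
    // Attach an AVCaptureVideoPreviewLayer(session:) to a view to actually see the feed.
    return session
}
```

Same caveats as above, though: noticeable latency, and nothing about this gets you 120fps.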
In theory, for a live broadcast with either of these options, we could delay the other cameras and mics to match the phone’s delay and end up with synchronized video coming from the phone. But you can’t record on the phone at the same time, and there’s still no 120fps. So we basically stopped looking into this, because for an interview we need the slow-motion phone recording more than we need the live feed.
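The delay-matching part is conceptually just a hold buffer on every source except the phone. A toy Swift sketch of the idea (the 300 ms figure is made up and the DelayLine name is just illustrative); in practice most streaming/switching software already has a per-source sync offset that does this for you:

```swift
import Foundation

// Toy fixed-delay line: hold each frame (or audio buffer) from the fast
// sources for a set time so they land in sync with the laggy phone feed.
final class DelayLine<Frame> {
    private var queue: [(arrival: Date, frame: Frame)] = []
    private let delay: TimeInterval

    init(milliseconds: Double) { self.delay = milliseconds / 1000.0 }

    // Call when a frame arrives from a low-latency source.
    func push(_ frame: Frame) { queue.append((Date(), frame)) }

    // Call on every output tick; returns frames whose hold time has elapsed.
    func popReady(now: Date = Date()) -> [Frame] {
        let ready = queue.prefix { now.timeIntervalSince($0.arrival) >= delay }
        queue.removeFirst(ready.count)
        return ready.map { $0.frame }
    }
}

// e.g. if the mirrored phone view runs ~300 ms behind, hold everything else 300 ms:
let otherCameras = DelayLine<Data>(milliseconds: 300)
```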
If someone figures out a way to record on the phone in 120fps while still transmitting reliable live video to a host computer for broadcast or other purposes, we might look into it again. I sort of doubt anyone will figure out how to transmit HD 120fps video over a network, at least not without H.265 or something. The data rate is just too high.
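For a rough sense of the numbers: uncompressed 8-bit 4:2:0 1080p at 120fps works out to around 3 Gbit/s, which is why you’d need serious real-time compression (H.265 or similar) before it would fit through wifi. Back-of-the-envelope, in Swift:

```swift
import Foundation

// Rough data rate for uncompressed 1080p at 120 fps,
// assuming 8-bit 4:2:0 chroma subsampling (1.5 bytes per pixel).
let width = 1920.0, height = 1080.0, fps = 120.0, bytesPerPixel = 1.5
let bytesPerSecond = width * height * bytesPerPixel * fps        // ≈ 373 MB/s
let gigabitsPerSecond = bytesPerSecond * 8 / 1_000_000_000       // ≈ 3.0 Gbit/s
print(String(format: "%.1f Gbit/s uncompressed", gigabitsPerSecond))
```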