Seems like a useful site for music creators who want to post their music online. Good to have this in the arsenal.
It could definitely be helpful. I tried a file on there earlier that I had measured at -8 to -9 LUFS, and the feedback it gave looked about right per the specs.
I’m not convinced. I have tested a number of tracks, all mastered to -10 LUFS, yet the readings from the Loudness Penalty site vary widely between tracks. Although most services do not use the BS.1770 algorithm to measure loudness, I would still expect whatever they are using to be consistently pro rata to the LUFS value, so I can only conclude that the Loudness Penalty site’s own algo is suspect.
I think it’s a nice idea.
But I think it’s ultimately best if people aim for whatever loudness they think suits the song best.
I was feeling very clever a few months ago because I DIY-mastered a song so it would pretty much hit the YouTube LUFS target, and I thought I’d won the game because my new video single would be punchier, with more transients and dynamics, than most other stuff.
A couple of people suggested to me that it was a bit too dynamic - they were having to adjust the volume as it went into the loud choruses. So making it less dynamic would have been better. Doh!
I take a “it is what it is” approach nowadays. Don’t mess around with these things too much, kinda mix and master to what seems good for me and go from there.
As I’ve had material broadcast on FM and on various web streaming radio stations recently, I can tell you that none of these services is uniform - it is a complete mess from one to the next, and you can never get it right. I think good songwriting trumps it all, and I try to focus on that. People can always buy the high-quality file later if they want it…too bad that music sales have become a thing of the past.
Great to see you guys are trying this out.
Just to respond to a couple of the comments:
In our tests the LP values are within half a dB of the real-world results. (Apart from iTunes/Sound Check, which is still only an estimate). You can test it yourself on YouTube using the “Stats for nerds” function (right-click on any video to access them).
The variability in the values is partly because each service uses a different reference level, but more importantly because only TIDAL use LUFS values. LUFS can only give you an approximate idea of the real values, which is exactly why we made the site.
Also, we agree that it’s not the best idea to use the results as “targets”. This is unlikely to work - why should an acoustic ballad be at the same level as a rock song, for example? Instead, the idea is to use the site to get feedback and understand: if I master a song like this because I think it sounds good, how will it be affected by loudness normalisation, and am I happy with the results?
For example, if a song is being turned down by 6 dB or more on all the services, personally I want to at least experiment with a more dynamic version, to see how it compares. (Which you can do live on the site, since we added a Preview function). On the other hand, I’m quite comfortable with a loud song being turned down by a dB or two on YouTube, if it sounds good to me.
Hope that helps!
Hey @ianshepherd! Thanks for joining in! Always great having folks pop in from the companies that are making the software we’re using.
Hope you’re liking the site so far. Feel free to spread the word!
But surely each service’s algo must be consistently pro rata to LUFS? After all, LUFS is the internationally accepted standard. What’s the point in them using a loudness algorithm that doesn’t measure loudness correctly? Here are the results of three tracks, all mastered to exactly -10 LUFS (Orban standalone):
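The “pro rata” expectation is easy to state in code: if a service normalised purely by LUFS, the penalty would be a fixed function of the measured integrated loudness, so identical LUFS masters would always get identical adjustments. A minimal sketch - the function name and the -14 LUFS reference are illustrative assumptions, since the real references vary by service:

```python
# Sketch of the "pro rata" expectation: a pure-LUFS normaliser would
# apply exactly (reference - measured) dB of gain, nothing else.
def expected_penalty_db(measured_lufs, reference_lufs=-14.0):
    """Gain a pure-LUFS normaliser would apply (negative = turned down)."""
    return reference_lufs - measured_lufs

# Three tracks all mastered to exactly -10 LUFS should therefore all
# receive the same adjustment under a pure-LUFS scheme:
for lufs in (-10.0, -10.0, -10.0):
    print(expected_penalty_db(lufs))  # -4.0 each time
```

Any spread in the results for same-LUFS masters means the service is measuring something other than plain LUFS.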
Even the TIDAL estimate is inconsistent, and as you say, TIDAL uses LUFS.
Not measuring loudness correctly, or not measuring loudness to the international standard? Just because something is a standard doesn’t mean it’s correct, or the best. It’s just a standard. And even if it were the best, I can’t see how being off by 0.2 dB would be of any concern. That’s almost nothing.
You’re missing my point - or I didn’t illustrate it well enough. I’m not trying to show that the services have got their algos wrong, I’m trying to show that the tool’s algo is inaccurate, and therefore of limited use.
If you’re mastering for a living, it’s a concern. Besides, some of those measurements vary by 0.5 dB. If 0.5 dB is not a concern to you, fine, but it’s certainly a concern to me.
Thanks @holster, good to be here
@AJ113 I actually agree with you - if I were implementing loudness normalisation now, I’d use LUFS. And that’s exactly what TIDAL (who are the most recent to join the fun) have done. But not the others.
For example, someone sent me an example the other day where TIDAL wouldn’t change the gain, but Spotify would turn it down by over 3 dB, even though the nominal reference level is -14 LUFS on both services.
So as bozmillar says, they’re just all doing it differently, and that’s the value of the site. If LUFS were an accurate enough predictor, we wouldn’t have bothered to make it.
As far as the discrepancies in the TIDAL values are concerned, I can’t say for sure without access to the files you’re using - would you be able to share them so we can test them ourselves? In our tests the site agrees with measurements made in other apps, but I haven’t tried the Orban meter.
@ianshepherd - thanks for putting this tool out. I’ve been using it all the time just to see if my mixes are in the ballpark. I’m always hoping it’ll tell me my mix would be turned down a dB or two rather than up.
Like the fool I am, I didn’t check the original LUFS of the files I processed - they were not exactly -10. However, I did three more that are exactly -10 LUFS:
So at least now the TIDAL adjustment is consistent, as one would expect. However, my original point stands: does the discrepancy across the same service mean that the service’s algo is incorrect, or does it mean that the tool is inaccurate? I mean, according to the tool, all three services are showing 1.5 dB variances (between their own measurements, not between each other) on files that are all exactly -10 LUFS.
As I said, in our tests there are no discrepancies between the measured LUFS values and the TIDAL results, so I’m not sure why you’re seeing the 0.2 dB difference. Do you have another tool to measure the loudness?
As for the different services, let me check I understand what you’re asking. These are three different songs, all measuring -10 LUFS integrated, correct? In which case the TIDAL values should all be the same (measured using LUFS), but the values on the other services are expected to be different, because they’re measuring the files in different ways. They use different EQ curves, different gating thresholds, etc.
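To see how gating thresholds alone can change the answer, here is a simplified sketch of BS.1770-style gated averaging over per-block loudness values. The function name and block values are made up for illustration, and real measurement also involves K-weighting filters, which are omitted here - this only shows the gating step:

```python
import math

def gated_loudness(block_lufs, abs_gate=-70.0, rel_gate_lu=10.0):
    """Simplified BS.1770-style integrated loudness from per-block LUFS.

    Blocks below the absolute gate are dropped, then blocks more than
    rel_gate_lu below the provisional average are dropped too.
    """
    def mean_lufs(blocks):
        # Average in the energy domain, then convert back to dB.
        energy = sum(10 ** (b / 10.0) for b in blocks) / len(blocks)
        return 10.0 * math.log10(energy)

    stage1 = [b for b in block_lufs if b > abs_gate]
    provisional = mean_lufs(stage1)
    stage2 = [b for b in stage1 if b > provisional - rel_gate_lu]
    return mean_lufs(stage2)

# A track with quiet verses and loud choruses: a wider relative gate
# keeps more of the quiet material and reports a lower loudness.
blocks = [-30.0, -30.0, -12.0, -12.0, -12.0]
print(round(gated_loudness(blocks, rel_gate_lu=10.0), 1))  # -12.0
print(round(gated_loudness(blocks, rel_gate_lu=20.0), 1))  # -14.2
```

Two meters disagreeing by a couple of dB on the same dynamic track can come down to nothing more than this kind of threshold choice.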
The site’s values are accurate (usually within 0.1 dB of the measured results on the services) and the services’ algos are working correctly - they just reach a different conclusion about how loud things are, because they measure it in different ways. Does that make sense?
I’m not, please see my last post. TIDAL is consistent, now that I have input tracks that are exactly -10 LUFS.
OK fair enough. In that case the services are playing a wacky game. How is anyone supposed to achieve the correct loudness if they either: 1. Don’t adhere to LUFS or 2. Don’t release their algo for public consumption?
That’s the thing - it’s not necessary to match their reference loudness. They will adjust the level for us, so we don’t need to worry, and can just make the songs sound great. It’s wise to use Loudness Penalty and/or LUFS measurements to make sure you’re not going way too loud (or too quiet, since YouTube and TIDAL don’t turn things up) but matching the loudness isn’t necessary. Or even advisable - an acoustic ballad shouldn’t sound as loud as a death metal song, in my opinion. (Bear in mind that all these services apart from YouTube & Pandora have an “album mode” which maintains relative levels between collections of songs)
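The “they will adjust the level for us” point can be made concrete with a sketch: normalisation applies the gain automatically, and on services that don’t turn quiet material up, the gain is simply clamped at zero. The function name and the -14 LUFS reference here are illustrative assumptions - real references differ per service and change over time:

```python
# Sketch of a normaliser's gain decision under an assumed reference.
def normalisation_gain_db(measured_lufs, reference_lufs=-14.0, turns_up=False):
    """Gain a service would apply to a track. With turns_up=False the
    service only ever turns loud material down, as YouTube and TIDAL do."""
    gain = reference_lufs - measured_lufs
    if not turns_up and gain > 0:
        return 0.0  # quiet masters are left alone, not boosted
    return gain

# A -10 LUFS master is turned down 4 dB either way...
print(normalisation_gain_db(-10.0))                 # -4.0
# ...but a -18 LUFS master is only boosted where turning up is allowed.
print(normalisation_gain_db(-18.0))                 # 0.0
print(normalisation_gain_db(-18.0, turns_up=True))  # 4.0
```

This is why the only real risk is going way too loud (wasted headroom) or way too quiet (no boost to compensate), rather than missing an exact target.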
I wrote more about this topic here:
(Also, remember several of these services implemented normalisation before LUFS was widely used. They’re not doing it to be awkward, there are practical considerations)
I would imagine anyone who has got any concern at all about loudness is measuring LUFS these days. What is the advantage of your tool over an LUFS meter?
I feel a bit like we’re going round in circles, now :-/
The Loudness Penalty site doesn’t use LUFS (except for TIDAL). We’ve analysed all the services and provide more accurate values than simple LUFS estimates, typically within 0.1 dB of the actual value. Apart from iTunes, which is only an estimate - but we’re working on it!
The advantage of the site over an LUFS meter is that it will show you exactly when an LUFS estimate is misleading - for example in the examples you listed, the third track will be reduced by over a dB more on Spotify than on TIDAL, even though those services have similar overall reference levels. I’ve seen other more extreme cases where the difference was over 3 dB.
You’re free to use LUFS of course if you prefer - but the site is free if people want to try it.