 #asknostr  is there ANYONE out there with a #bitcoin  price chart that is scaled to a power of two (log2) vertical axis instead of either linear or log10?

i'd really like to see how it looks, i bet it's a straight line to the moon

the reason being that its issuance rate is based on halvings, which puts a power-of-2 factor into the price movement... which means it's going to look like it's flattening out on a log10 chart - if i remember how these map to each other, the two scales differ by a constant factor of log(10)/log(2), i.e. log2(10) ≈ 3.32
 Never seen one but sounds interesting. 

Might make it a side quest for today. But later.  
 👀
nostr:nevent1qqs9ssvkva8zuvk8sv83c2yjd0t6sdfvyx5cxk8nehg3tsrg8q3w2jqpzamhxue69uhkzarvv9ejumn0wd68ytnvv9hxgtczypxgqqjh5ky2s2zf6pyczlptm2kesje953ddnak66ehy05a50caj7qcyqqqqqqg2kak9a 
 It should be identical, in practice, just scaled.

The property of a log chart is that proportional changes have the same height. So 2x has the same vertical height whether it’s $0.01 to $0.02 or from $10k to $20k.

This proportionality property exists irrespective of the base used in the log operation. The raw values will differ, but it’s common to scale the height of the chart based on the display anyway to focus on what you’re trying to show. 
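
A quick numpy sketch of that property, for anyone who wants to check it (my numbers, not from the thread): a 2x move spans the same vertical distance whether it happens at $0.01 or at $10k, and it holds in any base.

```python
import numpy as np

# Equal price ratios produce equal log-space heights, in any base
for log in (np.log2, np.log10):
    cheap = log(0.02) - log(0.01)           # $0.01 -> $0.02
    expensive = log(20_000) - log(10_000)   # $10k -> $20k
    print(log.__name__, cheap, expensive)   # same height within each base
```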
 https://cdn.satellite.earth/c01c24ba7e849f37fb86d0f9825d659b7d48559d04ade1e8c27a80f81d83848d.png

this is the comparison of the linear, log2 and log10 zero points on a graph

obviously the log10 (which is often what is meant when log is written without a base) is a flatter curve

the log2 curve decelerates more slowly, taking longer to reach the same gradient 
 https://cdn.satellite.earth/1ff6b87ed64a58d01af206edf1095d3909dd955d2ea1e4bafbf8ac0f29c7bed9.png

this represents the two different zeroes based on log10 (black) and log2 (red) with their respective double and half values, which is the range you'd see on the chart

i added this one to illustrate how, especially in the early part of the graph, the angle of the log2 version is going to give a more linear result against time for our price chart 
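
For anyone who wants to reproduce this comparison locally, a minimal matplotlib sketch of the three curves being discussed (synthetic, not the original images):

```python
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(0.1, 20, 400)

# Compare the shapes of the linear, log2 and log10 curves on one plot
plt.plot(x, x, label="linear")
plt.plot(x, np.log2(x), label="log2")
plt.plot(x, np.log10(x), label="log10")
plt.axhline(0, color="gray", linewidth=0.5)  # shared zero line
plt.legend()
plt.show()
```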
 Yes, the log(2) curve will grow “faster”. Its values are scaled by a constant factor, namely log2(10) ≈ 3.32.

But when you render a chart for display, there’s an implicit scaling between the actual values and the position (in pixels) of that visual element (line).

This implicit scaling factor is arbitrary, and is chosen by the charting library, or manually by the chart author, to showcase the phenomena under investigation.

You could add to this chart another log line with base, say, 10 billion, and the other lines would be entirely squashed at the bottom. But the shape of each line would remain the same, preserving the change proportionality property. 
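
A quick numeric check of that squash-without-reshape claim (plain numpy, arbitrary values):

```python
import numpy as np

x = np.array([0.01, 1.0, 100.0, 10_000.0])

# log2 is just log10 scaled by the constant log2(10) ~= 3.32
print(np.allclose(np.log2(x), np.log2(10) * np.log10(x)))  # True

# Even a base of 10 billion only rescales the values, never the shape
log_huge = np.log(x) / np.log(1e10)  # log base 10,000,000,000
print(np.allclose(np.log2(x), np.log2(1e10) * log_huge))   # True
```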
 yes, and my whole point is that what we currently see is a faster flattening in the early part of the graph, just like the graph of these curves shows

in any case, if fiat supply is growing roughly on a power of ten basis, but the bitcoin supply grows on a power of 2 basis, then yes, at a larger scale and longer timeline it's still going to flatten out, but not so soon as only 3 halving cycles in... you can literally correlate these relationships to the coordinates on this example graph 
 the proportionality is precisely related to the time

if bitcoin is a hedge against inflation, then you would see that against the right approximate ratio of the growth rates between the two. with the proportional changes scaled vertically - so the more it goes up, the less it goes up on the chart - you flatten out the variance of the exchange rate and start to see the actual adoption rate over time. and if the proportional change rate matches up closely to what is actually happening, then it will follow a linear path - time versus value should find a close to x=y ratio somewhere between the linear price and the log10

understanding this ratio correctly - especially since we are coming up to the end of the fourth cycle - should give a better estimation of the likely next cycle peak, and the ones after that, and as time continues this ratio will tend to smooth out due to the averaging over the total timescale

i'm just saying that because bitcoin's supply is power of 2, a log2 chart is the right one to use - the supply is decreasing at that rate, so its value should become close to linear at that scale 
 > i'm just saying that because bitcoin's supply is power of 2, a log2 chart is the right one to use

I’m just saying that there’s no difference from a charting perspective, because their shapes are identical, merely scaled.

If you take a log10 chart, then fit it so that the min and max are at the bottom and top of the rendered height, then do the same for a log2 chart, the two charts will be literally identical, with the only exception being the labels on the y-axis.

Algebraically, using a different log base results in a linear scaling factor on the output. Plotting to pixels requires its own linear scaling factor. And so a log chart will look the same regardless of the base, provided the pixel scaling factor is adjusted accordingly. 
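
A small sketch of that fit-to-rendered-height argument (synthetic prices; any positive series would do): min-max both log series into the same pixel range and they come out identical.

```python
import numpy as np

price = np.geomspace(0.01, 100_000, 200)  # synthetic price series

def fit_to_height(y, height_px=500):
    # Linear projection: series min at pixel 0, series max at height_px
    return (y - y.min()) / (y.max() - y.min()) * height_px

print(np.allclose(fit_to_height(np.log10(price)),
                  fit_to_height(np.log2(price))))  # True: same pixels
```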
 saying they are the same is like saying "on a long enough timeline the survival rate for everyone falls to zero" or that "a circle normalizes to a line", it's an a = a expression

that's precisely my point... it fits a progression that has a base to it and it's somewhere around 2, for sure... it may diverge from that now but we have a long history and the divergence is not likely to be more than a standard deviation based on the correct baseline 
 Let’s reset. We’ll start with constant functions and work our way up to logarithms.

Consider these two functions:

f(x) = 1
g(x) = 2

Questions for you:

A) What do these functions look like when plotted on a traditional XY Cartesian plane?

B) Do they look the same or different? 
 you need to normalize data to its underlying scaling pattern, or you can't see the patterns in the visualisation

a circle looks like a line with a big enough coefficient compared to the area under the plot, that's the point... a circle is a line, in a partial higher dimension, and same goes for two dimensional statistical data comparing two variables (time and price in this case) - patterns appear in a cluster of data when you normalize it

it's not saying the graph is the normal pattern, necessarily, but if you find a good curve fit it is often predictive 
 > you need to normalize data to its underlying scaling pattern, or you can't see the patterns in the visualisation

Correct. In a data visualization, there is a projection from the domain of the data (time, dollars, etc.) onto the range of values supported by the visual medium (pixels, centimeters, etc.). The choice of function to apply (log/linear) and parameters (base for log, offset, scale) is arbitrary and made for aesthetic reasons.

> patterns appear in a cluster of data when you normalize it

Agreed that the pattern you see depends on the function applied and the parameters you choose. It is my claim that the choice of base and linear scaling parameter are functionally equivalent. The remainder of this post explains how.

Focusing on the Y axis, and assuming you have chosen log scale, here are the parameters you can choose:

- B - base for log function
- m - scale factor for linear projection
- c - offset for linear projection

The linear projection here is from log price to screen pixels. So the total function from price to Y coordinate is:

f(p) = m * logB(p) + c

Let’s consider the algebraic impact of choosing a different base, B’, for a different price projection function, f’:

f’(p) = m * logB’(p) + c

What is the relationship between f and f’, visually? Let’s find out.

It is a rule of logarithms that one can compute a value in a new base according to this formula:

logB(x) = logA(x) / logA(B)

So for us, that means that:

logB’(x) = logB(x) / logB(B’)

Substituting this into our price projection function f’:

f’(p) = m * logB(p) / logB(B’) + c

Refactoring:

f’(p) = m * logB(p) / logB(B’) + c * logB(B’) / logB(B’)

f’(p) = (m * logB(p) + c * logB(B’)) / logB(B’)

f’(p) = (m * logB(p) + c + c * (logB(B’) - 1)) / logB(B’)

Substituting our original f definition:

f’(p) = (f(p) + c * (logB(B’) - 1)) / logB(B’)

Refactoring:

f’(p) = f(p) / logB(B’) + c * (logB(B’) - 1) / logB(B’)

Since B, B’ and c are all constants, what this last formula shows is that f’ is a linear projection of f. That is, it fits the form y = ax + b for constants a and b.

I hope none of the above is controversial (unless I’ve made a mistake in the math).

What does this mean for us, in a data visualization context? As noted earlier, visualizing a function requires projecting from the domain of the data into the range of pixel values. Putting log aside, this means, at minimum, picking a scaling factor and an offset. These values are arbitrary and chosen for aesthetic effect.

So irrespective of whether we use f or f’, we’ll end up linearly scaling the values to project them into pixel space using arbitrary, aesthetically chosen parameters. If we have the same aesthetic intent in both cases, we will select projection parameters that yield identical graphs. The parameter values we pick will be different, but the pixel values will be the same (by definition, since we have the same aesthetic intent for both bases).

This is what I mean when I say the graphs are identical. The only way in which they differ is by a linear scaling function, and we control arbitrary linear scaling parameters.

I hope this is clear. The choice of log base and the choice of linear scaling factor are in the same category of arbitrary visualization parameters. Moreover, they have the same effect. If you squish vertically by choosing a higher base, you can stretch vertically to counterbalance that choice by choosing a larger scale factor. 
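
A numerical spot check of that final formula, with arbitrary m, c and bases (not tied to any price data):

```python
import numpy as np

m, c = 7.0, 3.0              # arbitrary projection scale and offset
B, B2 = 10.0, 2.0            # original base B and alternative base B'
p = np.geomspace(0.01, 100_000, 50)  # any positive prices

f = m * np.log(p) / np.log(B) + c     # f(p)  = m * logB(p)  + c
f2 = m * np.log(p) / np.log(B2) + c   # f'(p) = m * logB'(p) + c

k = np.log(B2) / np.log(B)  # logB(B')
# Derived relation: f'(p) = f(p)/logB(B') + c*(logB(B') - 1)/logB(B')
print(np.allclose(f2, f / k + c * (k - 1) / k))  # True
```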
 https://m.primal.net/Krji.png  using https://mempool.space/api/v1/historical-price?currency=USD 
 à la 

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import requests


currency = "USD"

# Fetch the full historical price series from mempool.space
response = requests.get(
    f"https://mempool.space/api/v1/historical-price?currency={currency}"
)
data = response.json()
prices = data["prices"]

df = pd.DataFrame(prices)
df["time"] = pd.to_datetime(df["time"], unit="s")  # epoch seconds -> datetime

# Plot log2 of the price against time
plt.plot(df["time"], np.log2(df[currency]))
plt.ylabel(f"Historical {currency} Price [Log Base 2]")
plt.show()
```
 
 very nice... see, it does demonstrate my point, the data is much closer to a line in the log2

yes, i remember, actually - there is a nice simple web page you can make, with html embedding some CDN scripts, where you push the data points into it and it renders a nice graph... i used it with a difficulty adjustment graph that compared hashrate to the block time

i think what i am digging at is that there is a particular curve that fits the chart closest to a linear progression, it could be log e, or maybe a little under 2, like maybe even the square root of 2 is a better fit

the purpose of the exercise is to illustrate how, over 15 years, there has been a fairly consistent path in the price curve that probably can be expected to continue. really, you probably can just overlay and offset the correct power curve line as the normal level, but seeing it linear, and scaled to make the SDs the same width, helps a lot in understanding it i think 
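
A minimal sketch of that curve-fit idea, reusing df and currency from the snippet above: fit a straight line to log price over time and see how close it is. (Note that the fit quality comes out the same in every base, since changing base only rescales y by a constant.)

```python
# Reusing df / currency / np / pd from the earlier snippet
t = df["time"].map(pd.Timestamp.toordinal).to_numpy(dtype=float)
y = np.log2(df[currency].to_numpy())

slope, intercept = np.polyfit(t, y, 1)
fitted = slope * t + intercept

# R^2 of the linear fit; identical for log2, log10, ln, etc., because
# changing the base only rescales y by a constant factor
r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"doublings per year: {slope * 365.25:.2f}, R^2: {r2:.3f}")
```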
 ah, and it actually looks like the log10 is closer to the natural coordinate system of the data

only just noticed.. thanks btw hmmm 
 My bad... the units are different, so we can't plot them on the same graph in that way... here's the updated code:

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import requests


currency = "USD"

response = requests.get(
    f"https://mempool.space/api/v1/historical-price?currency={currency}"
)
data = response.json()

df = pd.DataFrame(data["prices"])
df["time"] = pd.to_datetime(df["time"], unit="s")

# Twin y-axes: log2 on the left, log10 on the right, same time axis
fig, ax1 = plt.subplots()
ax2 = ax1.twinx()
ax1.plot(df["time"], np.log2(df[currency]), color="Black", marker="o")
ax2.plot(df["time"], np.log10(df[currency]), color="#FF9900")

ax1.set_xlabel("Date")
ax1.set_ylabel(f"Log Base 2 Historical {currency} Price", color="Black")
ax2.set_ylabel(f"Log Base 10 Historical {currency} Price", color="#FF9900")

plt.show()
```

and the corresponding figure (i.e. it's the same shape):

https://m.primal.net/Krre.png  
 nostr:nprofile1qy88wumn8ghj7mn0wvhxcmmv9uq3zamnwvaz7tmwdaehgu3wwa5kuef0qyfhwumn8ghj7ur4wfcxcetsv9njuetn9uq3vamnwvaz7tm9v3jkutnwdaehgu3wd3skuep0qqsxzsz83jdwztcapd2qulzhspnyjvn6jxcypvrl0w3aahp40j4smfgchnfr5  you are right, it is the same

maybe a higher exponent will flatten it further, but it's not hard to see it's already getting quite flat now anyway - there is a trend that is decelerating by about the same amount each cycle, but it gets closer and closer to a normal y = x * coeff as time goes on... with that power ratio, though, just to remember, in the linear it's gonna be ... well

somewhere between 150-300 this time, and probably 4x as much as that next time, assuming no hyperinflation, at which point the USD value means nothing anymore 
 Putting aside the math for a second, let’s talk about price.

Price is entirely psychology. The price of something is what one is willing to exchange for it.

The price of #Bitcoin is crossing an uncomfortable chasm. $60k is too large for most people to consider buying crazy internet money, but the price of a sat is too small to reason about. People are not good with decimals.

Once we hit $1M per whole coin, a sat costs $0.01. A penny stock. At that price, we’ll stop pricing in tranches of 100M sats, like a one-time 1:100,000,000 stock split. Overnight, Bitcoin will go from $1M to $0.01, keeping the BTC ticker.

At that moment, everyone’s interests are aligned—the hard money advocates and the degenerate gamblers alike. That’s when the real FOMO moon pump begins.

So I don’t expect us to flatten into a pure exponential growth curve (line in log scale) just yet. Personally, I think that happens somewhere between cent/sat and dollar/sat parity.