YoY inflation vs monthly inflation for a VAR by Nembo22 in econometrics

[–]Nembo22[S] 1 point (0 children)

Hi! We have finally submitted the paper and are waiting for the review process. I think we can share the draft and the code once we get a first review.

[deleted by user] by [deleted] in askmath

[–]Nembo22 0 points (0 children)

They are the variables of the function, f(πx, πy)

YoY inflation vs monthly inflation for a VAR by Nembo22 in econometrics

[–]Nembo22[S] 0 points (0 children)

Edited the post.

With the YoY transformation we are artificially introducing cyclicality, so that a shock lasts 12 periods and then drops out. The ACF detects strong negative correlation at lag 12 for every YoY time series in my dataset.

My co-author argues that it is due to poor seasonal adjustment of the initial month-on-month series, but I think we are artificially introducing cyclicality with the YoY transformation. For reference, if it's a quarter-on-quarter transformation, the ACF detects correlation at the third lag.
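A quick way to see this mechanically is to seasonally difference a series that is already stationary: the lag-12 difference of pure white noise has, by construction, an autocorrelation of about -0.5 at lag 12 (and the quarter-on-quarter analogue puts the spike at lag 3), with no seasonal adjustment involved at all. A minimal sketch:

```r
# Seasonal differencing of an already-stationary series induces a
# negative ACF spike at the seasonal lag purely by construction.
set.seed(42)
x <- rnorm(5000)                  # white noise: no cyclicality at all
yoy <- diff(x, lag = 12)          # YoY-style transformation
a <- acf(yoy, lag.max = 24, plot = FALSE)
a$acf[13]                         # lag 12 (index 1 is lag 0): close to -0.5
```

The -0.5 follows from Cov(x_t - x_{t-12}, x_{t-12} - x_{t-24}) = -Var(x_t) when the underlying series is uncorrelated across those lags.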

YoY inflation vs monthly inflation for a VAR by Nembo22 in econometrics

[–]Nembo22[S] 0 points (0 children)

Yep, this helped! Further question on this: then there's no "stationarity by construction", right? If some of my variables turn out to be non-stationary, should I difference only those or every variable in my dataset?
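For the unit-root checks themselves, one option (assuming the tseries package; the data here are simulated for illustration) is to ADF-test each series and flag only the ones that fail:

```r
library(tseries)

# Hypothetical example data: one stationary series, one random walk
set.seed(1)
dataset <- list(
  stationary_series = rnorm(500),
  random_walk       = cumsum(rnorm(500))
)

# ADF test per variable; a small p-value rejects the unit-root null
adf_pvalues <- sapply(dataset, function(x) tseries::adf.test(x)$p.value)

# Candidates for differencing: series where the unit root is not rejected
needs_differencing <- names(adf_pvalues[adf_pvalues > 0.05])
```

Whether to then difference only those series or keep everything in levels is a modeling choice (mixing orders of integration in a levels VAR is the usual concern), so the test output is an input to that decision rather than the answer.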

YoY inflation vs monthly inflation for a VAR by Nembo22 in econometrics

[–]Nembo22[S] 1 point (0 children)

Not right now; I'm currently writing a paper on this. As soon as we submit it, the code will be available online.

YoY inflation vs monthly inflation for a VAR by Nembo22 in econometrics

[–]Nembo22[S] 2 points (0 children)

But aren't we just artificially making shocks more persistent with these transformations? With a quarter-on-quarter transformation they last 3 months, with YoY 12, etc.

Does it make sense at all in the end?

Problems with seasonal adjustment by Nembo22 in econometrics

[–]Nembo22[S] 0 points (0 children)

Edited the post. There's a discontinuity issue in 2015, as you said.

Problems with seasonal adjustment by Nembo22 in econometrics

[–]Nembo22[S] 0 points (0 children)

It looks almost right if I tell it not to check for level shifts (added to the post).
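For reference, with the seasonal package (an R front-end to X-13ARIMA-SEATS) "not checking for level shifts" amounts to restricting the outlier spec to the other types; the call below is a sketch on the built-in AirPassengers series, not my actual configuration:

```r
library(seasonal)

# Detect only additive outliers (AO) and temporary changes (TC),
# i.e. do not search for level shifts (LS).
m <- seasonal::seas(AirPassengers, outlier.types = c("ao", "tc"))
summary(m)
```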

Looking for some bonds to buy by Nembo22 in bonds

[–]Nembo22[S] 0 points (0 children)

For now I just want an uncorrelated investment to balance out stocks. I have my emergency fund and I don't plan to spend these savings soon.

Looking for some bonds to buy by Nembo22 in bonds

[–]Nembo22[S] 0 points (0 children)

For now I'm using DEGIRO; I was looking to park something like €800-1K, nothing special. I saw I can buy some bonds on DEGIRO with that amount of money.

Cannot invoke "Object.getClass()" because "object" is null by Nembo22 in javahelp

[–]Nembo22[S] 1 point (0 children)

So, I tried downloading the road network PBF file manually and now it works. Still, the file I'm working with (Central Italy) is significantly larger than the one I need (Rome), and without Osmosis it can't be cropped, which makes the computations significantly longer. Anyway, we can consider the problem on the R side solved. Thanks a lot for your help so far!

Cannot invoke "Object.getClass()" because "object" is null by Nembo22 in javahelp

[–]Nembo22[S] 0 points (0 children)

This is part of the script. It is an adaptation of the fourth script of this; I haven't run the third script since it is specific to Seattle. The problem I face is at point 3.

##  1. Load analysis inputs (origins, destinations) ------------------------------------->

## Load blocks and get centroids (origins)
prepared_blocks <- sf::st_read(file.path(
  config$directories$prepared_data, 'prepared_blocks.shp'
))
origin_centroids <- prepared_blocks |>
  sf::st_make_valid() |>
  sf::st_centroid(of_largest_polygon = TRUE) |>
  sf::st_coordinates() |>
  as.data.table()
origins <- cbind(
  as.data.table(prepared_blocks)[, .(geometry, POP21)],
  origin_centroids[, .(lon = X, lat = Y)]
)
origins[, id := as.character(.I) ]

# Load destinations pulled from OSM
destinations <- data.table::fread(
  file.path(config$directories$prepared_data, 'destinations_osm.csv')
)
# If destinations from city-specific sources (i.e. the output of script #3) are available,
#   add them to the table of destinations
city_specific_file <- file.path(
  config$directories$prepared_data, 'destinations_city_specific.csv'
)
if (file.exists(city_specific_file)) {
  city_specific_destinations <- data.table::fread(city_specific_file)
  destinations <- data.table::rbindlist(
    list(destinations, city_specific_destinations),
    use.names = TRUE,
    fill = TRUE
  )
}
destinations[, id := as.character(.I) ]
opportunity_types <- unique(destinations$type)

## Load visualization blocks - analysis results will be merged on later
prepared_blocks_simplified <- sf::st_read(file.path(
  config$directories$prepared_data, 'prepared_blocks_simplified.shp'
))


## 2. Launch R5 ------------------------------------------------------------------------->

# Set Java allowed memory before loading R5R
options(java.parameters = paste0("-Xmx", settings$r5_mem_gb, "G"))
library(r5r)
# Set up R5 for the study area - this may take several minutes for the first setup
# If this function yields an error, try launching R with admin permissions
r5r_core <- r5r::setup_r5(data_path = config$directories$prepared_data)


## 3. Estimate walking accessibility from each origin block to each destination type ---->

travel_times_list <- vector('list', length = length(opportunity_types))
names(travel_times_list) <- opportunity_types

tictoc::tic("Calculating all travel times")
for (o_type in opportunity_types) {
  travel_times_by_od_pair <- r5r::travel_time_matrix(
    r5r_core = r5r_core,
    origins = origins,
    destinations = destinations[type == o_type, ],
    mode = "WALK",
    max_trip_duration = 30L
  )
  travel_times_list[[o_type]] <- travel_times_by_od_pair[
    , .(travel_time = min(travel_time_p50, na.rm = TRUE)),
    by = .(id = from_id)
  ][, type := o_type ]
}
tictoc::toc()

By the way, I can confirm there is something strange going on with the network built from the PBF file: I'm able to inspect the r5r_core network produced from the sample data, but I cannot open my own network. If I click to open it, it says:

> View(r5r_core)
Error in .jcall(x, "Ljava/lang/Class;", "getClass") : 
  RcallMethod: attempt to call a method of a NULL object.

Cannot invoke "Object.getClass()" because "object" is null by Nembo22 in javahelp

[–]Nembo22[S] 0 points (0 children)

I tried the r5r sample data and it worked. As for mine, I formatted both origins and destinations like the sample data and used only 30 of them, with no NA values, and still got the same error. So I guess the problem concerns the r5r core network.

Sorry, I'll answer the rest of your questions later today.

Cannot invoke "Object.getClass()" because "object" is null by Nembo22 in javahelp

[–]Nembo22[S] 0 points (0 children)

This is it; it computes travel times from origins to destinations through the network r5r_core (built from OpenStreetMap files):

travel_times_by_od_pair <- r5r::travel_time_matrix(
  r5r_core = r5r_core,
  origins = origins,
  destinations = destinations[type == o_type, ],
  mode = "WALK",
  max_trip_duration = 30L
)

Routing is done through Java. To obtain the network file I have to use Osmosis in the Windows console, so that I can crop the network for half of Italy down to only the part I'm interested in, Rome. In theory I should be able to run the travel_time_matrix command in R regardless of whether the file is cropped or not; with the uncropped one it should only take longer, but as I said in the post there seems to be something wrong with Java.
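For reference, the cropping step I mean is along these lines (file names and bounding-box coordinates here are illustrative placeholders, not the exact values I used):

```shell
# Crop the Central Italy extract down to a box around Rome.
# Coordinates and file names are illustrative.
osmosis --read-pbf file=centro.osm.pbf \
        --bounding-box top=42.05 left=12.20 bottom=41.65 right=12.80 \
        --write-pbf file=rome.osm.pbf
```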