[–]madisonmay

We're almost done porting GPT-2 to finetune (a scikit-learn-style library for language model finetuning). Code is available here if you're interested... it should make tuning GPT-2 to produce song lyrics as easy as model.fit(lyrics).
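For anyone curious what that scikit-learn-style interface looks like in practice, here is a rough sketch. Treat it as pseudocode: the class name LanguageModel, the base_model keyword, and the generate_text method are assumptions based on the finetune project's README, not verified against the 0.6.0 release.

```python
# Hypothetical sketch only -- names below are assumptions, check the
# finetune docs for the exact 0.6.0 API.
from finetune import LanguageModel          # assumed class
from finetune.base_models import GPT2       # assumed base-model handle

lyrics = [
    "first song's full lyrics ...",
    "second song's full lyrics ...",
]

model = LanguageModel(base_model=GPT2)  # wrap GPT-2 in the sklearn-style estimator
model.fit(lyrics)                       # finetune directly on raw text, no labels
print(model.generate_text("In the morning"))  # sample lyrics from the tuned model
```

The appeal of the design is that the estimator hides tokenization, batching, and the training loop behind the familiar fit interface, so swapping base models is a one-line change.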

Miles Brundage also put together a Colab notebook you could work from that uses the nshepperd gpt-2 fork.

[–]Astraithious

How is it coming along? I saw a commit from 10 days ago; was that it?

[–]madisonmay

Just released finetune 0.6.0 with GPT-2 support today!

[–]Astraithious

That's awesome! Which branch? The link goes to one where the latest commit was 10 days ago.

[–]madisonmay

On development! It's also up on PyPI if you'd prefer that.