David H.
As far as I know, the scaling feature has been deprecated in `docker-compose` for years... So I don't think there is any magic that lets us scale out by simply adding some more...
> the ecosystem is a little confusing right now, because there's also now `docker compose` built into `docker`

Oh... I thought Docker Swarm was end-of-life or something :see_no_evil:
I've always thought Kubernetes is too heavy for small clusters to maintain (myth?). I prefer `docker swarm` in this case.
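For what it's worth, a minimal sketch of how replica counts can be declared in a Compose v3 file: the `deploy` section is honored by `docker stack deploy` on a swarm (plain `docker-compose up` ignores it). The service name `web` and the image here are placeholders, not anything from this repo:

```yaml
# Hypothetical compose file; `web` and its image are placeholders.
version: "3.8"
services:
  web:
    image: nginx:alpine
    deploy:
      replicas: 3              # honored by `docker stack deploy` on a swarm
      restart_policy:
        condition: on-failure
```

With the newer `docker compose` plugin you can also scale without touching the file: `docker compose up --scale web=3`.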
@colinmegill I love the idea of `automated conversations`

> There is also potential benefit in 'automated conversations': if there is no human in the loop, conversations can be triggered procedurally, say...
> I'd like to help. From what I've seen, you've got relatively brief, semi-conversational snippets of text, which are obtained from comment threads. Is that roughly correct as a description?...
I saw this (a GPT-2-like model) claiming that it can do unsupervised clustering for texts (for Simplified Chinese texts): https://github.com/TsinghuaAI/CPM For Traditional Chinese it is possible to convert it to Simplified...
> I did a quick count on GPT-3's tagging of the comments in the Bowling Green dataset: https://gist.github.com/colinmegill/7714eb0962573346b210aa989e14dadf
>
> Here are the human generated categories:
>
> ```
> ...
> https://twitter.com/Nils_Reimers/status/1487014195568775173

I wonder why GPT-3 is used for encoding things; isn't it the "decoder" part of a transformer? I would use BERT-type models to do encoding, as they are the "encoder" part...
From a technical perspective, Universal Sentence Encoder (USE) and BERT are great; both include the encoder part of a transformer. I think that explains why they are better at encoding strings...
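To make the encoder point concrete: encoder models like USE or BERT map each sentence to a fixed-length vector, and grouping similar comments then reduces to vector similarity. A minimal sketch with toy hand-written vectors standing in for real embeddings (no actual model call; the sentences and numbers are made up for illustration):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for sentence embeddings an encoder (USE/BERT) would produce.
emb = {
    "ban fireworks downtown": [0.9, 0.1, 0.0],
    "fireworks should be restricted": [0.8, 0.2, 0.1],
    "fund more bike lanes": [0.1, 0.9, 0.2],
}

# Comments whose embeddings are similar would land in the same cluster.
s_close = cosine_similarity(emb["ban fireworks downtown"],
                            emb["fireworks should be restricted"])
s_far = cosine_similarity(emb["ban fireworks downtown"],
                          emb["fund more bike lanes"])
print(s_close > s_far)  # → True
```

The clustering step itself (k-means, HDBSCAN, etc.) would then run over these vectors; the point is just that the encoder's output, not the decoder's generated text, is what similarity-based grouping consumes.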
However, from a product point of view, it is true that people who are aware of this data collection may raise questions about privacy and robustness. Humans are afraid of things...