| Commit message | Author | Age | Files | Lines |
Also update ImportController with latest import (chrome, firefox & instapaper).
Import Firefox & Chrome bookmarks into wallabag
With real data, the previous one looked more like a converted Chrome file.
Also, fix the date conversion (hopefully).
Using `getScheduledEntityInsertions()` we can retrieve entities that are already persisted but not yet flushed, and thus avoid tag duplication on import.
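wallabag itself is PHP, and the real fix goes through Doctrine's `UnitOfWork::getScheduledEntityInsertions()`. As a language-agnostic illustration only, here is a minimal Python sketch of the idea, with toy `Tag`/`UnitOfWork` stand-ins (all names hypothetical, not wallabag's actual code):

```python
class Tag:
    def __init__(self, label):
        self.label = label

class UnitOfWork:
    """Toy stand-in for Doctrine's UnitOfWork."""
    def __init__(self):
        self.scheduled = []  # persisted but not yet flushed

    def persist(self, tag):
        self.scheduled.append(tag)

    def get_scheduled_entity_insertions(self):
        return list(self.scheduled)

def find_or_create_tag(uow, flushed_tags, label):
    # 1) look among already-flushed tags (what the old code did)
    if label in flushed_tags:
        return flushed_tags[label]
    # 2) also look among persisted-but-unflushed insertions,
    #    so two items imported before one flush share one Tag
    for tag in uow.get_scheduled_entity_insertions():
        if isinstance(tag, Tag) and tag.label == label:
            return tag
    tag = Tag(label)
    uow.persist(tag)
    return tag
```

Without step 2, importing two items with the same new tag in a single flush would create the tag twice.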
Instead of just saying “Failed to save entry”, we now save the entry at all costs and try to fetch its content. If fetching the content fails, the entry is still saved, just without content.
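The behavior described above can be sketched in a few lines. This is an illustrative Python outline, not wallabag's PHP implementation; `fetch_content` and `save` are hypothetical callables:

```python
def import_entry(url, fetch_content, save):
    """Save the entry no matter what; content is best-effort."""
    entry = {"url": url, "content": None}
    try:
        entry["content"] = fetch_content(url)
    except Exception:
        # fetching failed: keep the entry anyway, just without content
        pass
    save(entry)
    return entry
```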
Add generated files from `composer up`
Add more articles for Readability tests
- using javibravo/simpleue
- internal config values are now `import_with_redis` & `import_with_rabbit`, which are clearer
- if both options are enabled, RabbitMQ will be chosen
- import services related to async are now split into 2 files: `redis.yml` & `rabbit.yml`
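The selection rule between the two config values can be shown as a tiny sketch. Purely illustrative (the real wiring lives in Symfony service configuration); the function name is hypothetical:

```python
def choose_broker(import_with_redis, import_with_rabbit):
    """Pick the async transport; RabbitMQ wins when both are enabled."""
    if import_with_rabbit:
        return "rabbit"
    if import_with_redis:
        return "redis"
    return None  # neither enabled: import runs synchronously
```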
Also update Symfony deps
5000 by 5000.
Also, retrieve the newest items first.
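Batched retrieval like this follows a standard offset/count loop. A minimal Python sketch of the pattern, with a hypothetical `fetch_page` callable standing in for the remote API call:

```python
def fetch_all_items(fetch_page, batch=5000):
    """Pull items in fixed-size batches, newest first, until exhausted."""
    items, offset = [], 0
    while True:
        page = fetch_page(offset=offset, count=batch, sort="newest")
        if not page:          # empty page: nothing left to fetch
            return items
        items.extend(page)
        offset += batch
```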
Instead of queuing real Entry objects to process, we queue all the items to import from Pocket in a raw format.
Then the worker retrieves that information, finds or creates the entry, and saves it.
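The producer/worker split above can be sketched as follows. This is an illustrative Python outline with an in-memory queue, not wallabag's PHP/Redis/RabbitMQ code; the `store` dict stands in for the database:

```python
import json
from collections import deque

queue = deque()

def producer(raw_items):
    # queue the raw Pocket payloads instead of built Entry objects
    for item in raw_items:
        queue.append(json.dumps(item))

def worker(store):
    # pop one raw item, find or create the entry, then save it
    raw = json.loads(queue.popleft())
    entry = store.setdefault(raw["url"], {"url": raw["url"]})
    entry["title"] = raw.get("title")
    return entry
```

Keeping the queued payload raw means the worker, not the HTTP request, pays the cost of building and persisting entries.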
graby will throw an exception in some cases (like a bad URL, a restricted URL or a secured PDF).
The import didn't handle that case and the whole import broke.
With this commit the import isn't stopped; the failing entry is just skipped.
Also, as a bonus, I've added an extra test on WallabagImportV2 for when the JSON is empty.
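The skip-and-continue behavior is a per-item try/catch around the fetch. An illustrative Python sketch (not wallabag's PHP code; `fetch` is a hypothetical stand-in for the graby call):

```python
def run_import(urls, fetch):
    """Import every URL; a failing fetch skips that entry, not the run."""
    imported, skipped = [], []
    for url in urls:
        try:
            imported.append(fetch(url))
        except Exception:
            skipped.append(url)  # e.g. bad URL, restricted page, secured PDF
    return imported, skipped
```

Unlike the earlier save-without-content change, here the whole entry is dropped, but the rest of the import still completes.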