
Error when importing a File Graph - No uuid found for page name nil #191

Open
1 of 2 tasks
geosmina opened this issue Dec 18, 2024 · 9 comments

@geosmina

Search first

  • I searched and no similar issues were found

What Happened?

The File Graph importer throws the message "No uuid found for page name nil" when I try to import a medium-sized file graph. The error occurs more often with journal files, but there is also one regular page where the same problem occurred.

Error Image

Reproduce the Bug

  1. Go to "Import existing notes".
  2. Select "File to DB Graph".
  3. Select the File Graph folder.
  4. Import.
  5. The errors appear in notification balloons.

Expected Behavior

No response

Files

The following file is a log from the Firefox Console: console-export-2024-12-18_12-31-51.txt

I've tested in Firefox and Chrome. The error applies to both browsers.

Browser, Desktop or Mobile Platform Information

Logseq DB 0.10.10 - 7c2d17e
Firefox 133.0 (64-bit)
Google Chrome 131.0.6778.108 (64-bit)

Additional Context

Some of the pages that trigger the error were created before the date they represent; e.g., the journal entry from 2024-02-08 was created on 2024-02-07. This is because I sometimes plan my tasks ahead of time by creating the future day journal instead of using the deadline feature. I do this because I plan to do those tasks on a specific date, not just because that day is the deadline. However, this does not explain a page that is not a journal entry having the same problem.

Are you willing to submit a PR? If you know how to fix the bug.

  • I'm willing to submit a PR (Thank you!)
@logseq-cldwalker
Contributor

Hi. As requested in the issue form, could you provide the two journal files mentioned in the error? You can email them to [email protected] if you'd prefer not to attach them here.

@geosmina
Author

Files sent by email for privacy reasons. If you need more, please let me know.

@logseq-cldwalker self-assigned this on Dec 18, 2024
@logseq-cldwalker added the "can't reproduce" label (Issue will be closed if reply is not received) on Dec 19, 2024
@logseq-cldwalker
Contributor

Thanks for the files, but unfortunately I can't reproduce the error with them alone. You could either privately send me your whole graph (I assure you I will delete it), or try again on test.logseq.com in ~10 minutes and give me the latest console log with your errors. I've added some debugging output to the console that could help narrow down the issue(s). FYI, after tomorrow I will be on holiday break.

@geosmina
Author

Here's the console log file. If that data isn't enough/useful, please let me know, and I'll send the whole graph.

console-export-2024-12-19_17-38-22.txt

@logseq-cldwalker
Contributor

I still don't have enough to reproduce it. If you could DM your graph, that'd be helpful.

@geosmina
Author

Files sent by email for privacy reasons. Please be careful with my data 😄

@geosmina
Author

I've found the pattern for the bug in my graph. The importer throws that error message only when pages have backlinks to the following pages:

  • Propranolol;
  • Risperidona;
  • Pearson.

I've tried a few workarounds, but none of them worked: I deleted the pages, deleted and re-created them, reindexed the graph...

The only thing that worked was to remove the links (the [[]] brackets) and leave only the plain text. If I add a backlink to any of those terms, the importer throws the error again and does not import the files.
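For anyone hitting the same wall, the manual workaround above can be scripted. This is a minimal sketch, not part of Logseq itself: the page names come from this report, the `[[...]]` regex is a simplification of Logseq's link syntax, and the graph path in the commented loop is hypothetical.

```python
import re
from pathlib import Path

# Page names whose backlinks triggered the importer error (from this report).
PROBLEM_PAGES = {"Propranolol", "Risperidona", "Pearson"}

def strip_problem_links(text: str) -> str:
    """Replace [[Page]] with plain text, but only for the problematic page names."""
    def unlink(match: re.Match) -> str:
        name = match.group(1)
        return name if name in PROBLEM_PAGES else match.group(0)
    return re.sub(r"\[\[([^\]]+)\]\]", unlink, text)

# Hypothetical usage: rewrite every Markdown file in a graph folder
# (back up the graph first).
# for md in Path("my-graph").glob("**/*.md"):
#     md.write_text(strip_problem_links(md.read_text(encoding="utf-8")),
#                   encoding="utf-8")
```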

logseq-cldwalker added a commit to logseq/logseq that referenced this issue Jan 10, 2025
page name has same name as a task. Importer failed because of invalid
refs coming from gp-block/with-page-refs-and-tags which were caused
by get-first-page-by-title returning blocks. In 4f368d5,
get-first-page-by-title started returning blocks instead of only pages
so this was undone.  Fixes part of logseq/db-test#191
@logseq-cldwalker added the "can reproduce" and "bug" labels and removed the "can't reproduce" label on Jan 10, 2025
logseq-cldwalker added a commit to logseq/logseq that referenced this issue Jan 13, 2025
rather than one notification per file. Also ignore pdf highlight pages
as user graphs shouldn't fail hard on features that aren't imported yet.
This allowed the user graph in
logseq/db-test#191 to import without errors
@logseq-cldwalker
Contributor

Thanks for the tips and for sharing your graph. I'm now able to import your graph without it failing. There are a few validation errors, but those are expected until support for PDF highlights is added. Once you can confirm this is fixed, I'll delete your graph from my computer.

@kerim

kerim commented Jan 14, 2025

I can confirm that this fix greatly reduced the number of "No uuid" errors for me on import. I still get 15, but 13 of those come from metadata created during Zotero imports. Specifically, I have page properties in the MD documents that look like this:

tags:: [[Social Science / Anthropology / Cultural & Social]]

tags:: [[History / Asia / Japan]], [[Social Science / Indigenous Studies]], [[✅]]

tags:: [[/unread]], [[Philosophy / Political]], [[Political Science / History & Theory]], [[Social Science / Sociology / Social Theory]]

tags:: [[/unread]], [[Art / Australian & Oceanian]], [[Social Science / Anthropology / Cultural & Social]], [[Social Science / Media Studies]]

These are just a few examples. The tags make sense in Zotero, but they cause problems when importing into Logseq.
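Until the importer handles these, one possible pre-import workaround is to rewrite the slash-separated Zotero tags so Logseq doesn't parse them. This is only a sketch under assumptions: that the " / " separators are the trigger, and that " - " is an acceptable substitute. The function name is made up for illustration.

```python
import re

def sanitize_tags_line(line: str) -> str:
    """On 'tags::' property lines, replace ' / ' inside [[...]] links
    with ' - ' so slash-separated Zotero categories don't read as
    Logseq namespaces. Other lines are returned unchanged."""
    if not line.startswith("tags::"):
        return line
    return re.sub(
        r"\[\[([^\]]+)\]\]",
        lambda m: "[[" + m.group(1).replace(" / ", " - ") + "]]",
        line,
    )
```

Run over each Markdown file's property block before importing (after backing up the graph).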
