Opened 7 days ago
Last modified 27 hours ago
#19456 new bug
[HaikuDepot] crashes with double free
| Reported by: | diver | Owned by: | apl-haiku |
|---|---|---|---|
| Priority: | normal | Milestone: | Unscheduled |
| Component: | Applications/HaikuDepot | Version: | R1/Development |
| Keywords: | | Cc: | |
| Blocked By: | | Blocking: | |
| Platform: | All | | |
Description
Haiku is running at a cloud provider. HaikuDepot crashes on launch every time. I can't reproduce it in VMware, though.
```
CPU(s): 2x AMD EPYC Milan
Memory: 4.00 GiB total, 477.09 MiB used
Haiku revision: hrev58709 Mar 6 2025 07:02:45 (x86_64)

Active Threads:
	thread 651: HaikuDepot (main)
	thread 658: Package Contents Populator
	thread 670: LocalPkgDataLoadProcess
	thread 671: team 651 debug task
	thread 665: w>HaikuDepot
		state: Call (double free 0x417c68a690)

		Frame            IP              Function Name
		-----------------------------------------------
		00000000         0x21d71c2d7f7   _kern_debugger + 0x7
			Disassembly:
				_kern_debugger:
				0x0000021d71c2d7f0:   48c7c0eb000000   mov $0xeb, %rax
				0x0000021d71c2d7f7:   0f05             syscall <--
		0x7f75861a9cd0   0x21d71cb7867   wrterror + 0xa7
		0x7f75861a9d60   0x21d71cb90c2   ofree.constprop.0 + 0x772
		0x7f75861a9db0   0x21d71cba944   free + 0x74
		0x7f75861a9de0   0xab68d2c822    BReferenceable::ReleaseReference() + 0x82
		0x7f75861a9e00   0xab724fc378    PackageUtils::DepotName(BReference<PackageInfo> const&) + 0x48
		0x7f75861a9e40   0xab724c1a2b    PackageRow::UpdateRepository() + 0x1b
		0x7f75861a9e70   0xab724c1cd0    PackageRow::PackageRow(BReference<PackageInfo> const&, PackageListener*) + 0xf0
		0x7f75861a9ee0   0xab724c27a8    PackageListView::AddPackage(BReference<PackageInfo> const&) + 0x68
		0x7f75861a9f40   0xab724ab45a    MainWindow::_AddRemovePackageFromLists(BReference<PackageInfo> const&) + 0x8a
		0x7f75861aa050   0xab724b04f2    MainWindow::MessageReceived(BMessage*) + 0x9b2
		0x7f75861aa0e0   0xab68ccbd23    BWindow::task_looper() + 0x1d3
		0x7f75861aa100   0xab68c0bf4b    BLooper::_task0_(void*) + 0x1b
		0x7f75861aa120   0x21d71c2c507   thread_entry + 0x17
		00000000         0x7ffd0ae8c258  commpage_thread_exit + 0
```
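For illustration only, a minimal sketch (not the actual HaikuDepot code; `PackageInfoLike` is a hypothetical stand-in) of one way a `BReferenceable` object can trip the allocator's double-free check, as seen in the backtrace above: two `BReference` wrappers are each told they "already have" the same single reference, so the object is released twice.

```cpp
// Hypothetical illustration of the double-free pattern in the backtrace;
// PackageInfoLike stands in for a ref-counted class such as PackageInfo.
#include <Referenceable.h>

class PackageInfoLike : public BReferenceable {
};

int
main()
{
	// A freshly constructed BReferenceable starts with a count of 1.
	PackageInfoLike* raw = new PackageInfoLike();

	// Correct: this wrapper adopts the initial reference.
	BReference<PackageInfoLike> first(raw, true);

	// Bug: a second wrapper also claims that same single reference instead
	// of acquiring its own, leaving a count of 1 with two owners.
	BReference<PackageInfoLike> second(raw, true);

	// On scope exit both wrappers call ReleaseReference(). The first call
	// drops the count to 0 and deletes the object; the second call frees
	// it again, which the allocator reports as "double free".
	return 0;
}
```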
Hello @diver; I have a rewrite of that whole area coming shortly, which will hopefully sort this out.