PDA

View Full Version : Request: Global search



ffujita
February 26th, 2016, 22:43
Background: FG allows me to search within a category, within a tab.

What I'd like is for there to be a search that would go across categories, and across tabs, that would show (or at least link to) all the entries in all of the open modules.

The Token search does this, but only for Tokens. Even if the Token search functionality were applied to the "Story" and "NPC" categories, it'd be a start.

(I'm a 5E person.) But really, what I'd like is to be able to search for "rapier" and get the weapon from the Player's Handbook, any magical rapiers in the Dungeon Master's Guide, any NPCs or PCs carrying rapiers, any reference information in the various modules that talks about rapiers, any pictures of rapiers in the Maps section, etc.

I know it's probably not possible (because, why would you leave out something like this if it were easy to do) ... but I thought I'd ask anyway.

Thanks!

Moon Wizard
February 26th, 2016, 22:56
This is actually something that Doug and I have talked about quite a bit.

Here's the outcome of that discussion:

* Campaign data is in one place in the database, and module data is in another. The current controls within FG allow a single data path, so cross-path search would have to be built in Lua, which would be slower than the built-in controls.
* Data paths for a given record type vary across rulesets, and can even vary between modules for a given ruleset. The same applies to field names within each record (i.e. they can vary).
* There are multiple "types" of some records. For example, in 5E, there are "item", "reference_weapon", "reference_armor", "reference_equipment", ...
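To make the second and third points concrete, here is a minimal sketch (Python dicts standing in for FG's database tree; all paths and structures are invented, not FG's real API) of why a "global" search has to know every data path in advance: the same logical family of records is spread across campaign data plus several module reference types, and the list of paths differs per ruleset.

```python
# Hypothetical sketch (not FG's real API): weapon-like records can live
# under several different data paths, and those paths vary by ruleset.
CAMPAIGN_DB = {
    "item": {"id-001": {"name": "Rapier", "type": "Melee Weapon"}},
}
MODULE_DB = {
    "reference_weapon": {"rapier": {"name": "Rapier", "damage": "1d8"}},
    "reference_armor": {"plate": {"name": "Plate", "ac": 18}},
}

# The search layer has to be told every path to look under -- there is no
# single path that covers campaign data plus every reference type.
ITEM_PATHS = [
    (CAMPAIGN_DB, "item"),
    (MODULE_DB, "reference_weapon"),
    (MODULE_DB, "reference_armor"),
    (MODULE_DB, "reference_equipment"),  # may not exist in every module
]

def search_items(term):
    """Union search across every known data path for one record family."""
    hits = []
    for db, path in ITEM_PATHS:
        for node_id, record in db.get(path, {}).items():
            if term.lower() in record.get("name", "").lower():
                hits.append((path, node_id, record["name"]))
    return hits
```

A ruleset with different path names (or a module that adds its own) would need a different `ITEM_PATHS` list, which is exactly why this can't be a generic built-in control.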


The bottom line is that there is no simple drop-in way to do this. (I actually tried a simple approach, but it was confusing as to what exactly was being searched (and it wasn't everything), so we took it out.) It might also take an overhaul of the way modules are built, which has backward compatibility considerations too.

Longer term, I am looking at an overhaul of the way records are organized within the interface in order to be able to offer something like this in the future. However, it's more of a long term goal, so no ETA at this point.

Regards,
JPG

damned
February 26th, 2016, 23:20
It has been raised by someone else recently. I believe the answer was that searching across different data types is going to be very difficult, but better search within Categories will likely happen...

Oooops: Got sidetracked while answering!

MadCar_1
February 26th, 2016, 23:51
I wonder if a different approach might work here. What if you were to create a "Create Search Index" process that would walk through each of the modules and data types, creating a master (encrypted) data table. The creation process could be initiated by the user and/or could happen in the background.

This "Search Index Table" would contain all searchable text, associated datatype, and link to original reference. When a user conducts a global search, it would be executed against this generated table and results would be listed in 4 separate columns:
* Datatype / Module (Library Doc, NPC, Story, Table, Character, Note, ....)
* Name (Players Handbook, Bilbo, Room 5 - Kitchen, ....)
* found text in context of source document
* link to source reference

This should speed up the search process considerably. Thoughts?
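MadCar_1's proposal can be sketched roughly as follows (a minimal illustration with invented structures and names, not anything FG actually exposes): walk every module and record type once, flatten everything into an index of searchable text plus a back-link, then answer global searches from the flat index instead of the live tree.

```python
# Hypothetical sketch of the "Search Index Table" idea. The input shape
# {module: {datatype: {record_id: {"name":..., "text":...}}}} is invented.
def build_index(modules):
    """One-time walk over all modules/datatypes, producing a flat index."""
    index = []
    for module, datatypes in modules.items():
        for datatype, records in datatypes.items():
            for record_id, rec in records.items():
                index.append({
                    "datatype": datatype,
                    "module": module,
                    "name": rec.get("name", ""),
                    "text": rec.get("text", ""),
                    "link": f"{module}.{datatype}.{record_id}",  # back-reference
                })
    return index

def global_search(index, term):
    """Return the four columns the post describes: datatype, name,
    found text in context, and a link to the source record."""
    term = term.lower()
    return [
        (e["datatype"], e["name"], e["text"], e["link"])
        for e in index
        if term in e["name"].lower() or term in e["text"].lower()
    ]
```

The index is built once (slow) and queried many times (fast), which is where the speed-up in the proposal comes from.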

LordEntrails
February 27th, 2016, 02:14
Search indexing requires the creation of an index, effectively another database, and it has to be updated regularly. That would require one of three approaches:
1) Automated at the start or close of FG. Not something most users would want to wait for.
2) Automated as a background process. Who wants another background process that runs all the time? Not I.
3) Manual. I could handle this, but I doubt most people would remember to update their index regularly.

I would love a global search, but, I live pretty well without it.

I would rather see a couple of possible searches: campaign-wide and multi-module.

MadCar_1
February 27th, 2016, 02:30
LordEntrails,

I agree with your assessment; it would have to be a process kicked off manually, but it could run in the background once started.

From an implementation standpoint, it could be something only DMs or Ultimate License holders get access to.

MadCar

Moon Wizard
February 27th, 2016, 03:34
The FG database is an in memory tree, and none of the database nodes has an implied record type. This means that any node can be any record type, or no record type at all.

The only way to identify a node's type is for the ruleset to specify that a given node should be treated as a given record type.

This makes for a very flexible platform, but it kills any idea of indexing outside of the ruleset Lua code, because the indexer would have no idea where each ruleset stores records of a specific type.
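A small illustration of the point (structures and path names invented, not FG's real API): a raw node in the tree is just nested data, and nothing in the node itself marks its record type. The only thing that disambiguates records from arbitrary nodes is a mapping the ruleset supplies.

```python
# Two raw tree nodes -- nothing in either one says "item" or "npc":
node_a = {"name": "Rapier", "cost": "25 gp"}   # an item?
node_b = {"name": "Rapier", "hp": 11}          # ...or an NPC wielding one?

# Only the ruleset knows, by declaring which tree paths hold which
# record types (hypothetical paths, for illustration only):
RULESET_RECORD_PATHS = {
    "item": ["item", "reference.equipmentdata"],
    "npc":  ["npc", "reference.npcdata"],
}

def indexable_paths(record_paths):
    """Flatten the ruleset's declarations into (record_type, path) pairs.
    A generic indexer cannot produce this list on its own -- which is
    why indexing can't live outside the ruleset's Lua code."""
    return [(rtype, path)
            for rtype, paths in record_paths.items()
            for path in paths]
```

Without that declaration, an engine-level indexer looking at `node_a` and `node_b` has no basis for deciding what either node is, or whether it is a record at all.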

I was able to do a quick and dirty module search, but it was per category and didn't include campaign data. So, I want to step back and look at it from a comprehensive angle.

Also, the Unity version is moving to a folder paradigm instead of categories, so that will change the implementation in the long term as well.

Cheers,
JPG

dulux-oz
February 27th, 2016, 03:39
Just out of interest, is the Unity version (vU) going to retain the memory tree structure, or are you looking to implement an RDB and provide a method to import the "legacy" DB into the proposed new RDB?

damned
February 27th, 2016, 03:44
Don't most RDBs require a DB engine to be installed, potentially leading to more difficulties in the installation process?

dulux-oz
February 27th, 2016, 03:57
Yes, most RDBs do require an engine, but there are "engines" available as DLLs, specifically designed for local installs (e.g. off of a CD/DVD-ROM), which would work exceptionally well with Lua's table-based architecture - RDBs are, by definition, tables.

I can envisage the host loading the RDB up into Lua tables (as opposed to the memory tree that is currently created) and then sending those to the clients.
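That load step can be sketched as follows (a hedged illustration: Python dicts stand in for Lua tables, and the flat `(node_path, key, value)` row schema is invented for the example, not anything FG or an actual RDB defines):

```python
# Flat relational rows, as they might come back from a hypothetical RDB:
rows = [
    ("npc.id-001", "name", "Goblin"),
    ("npc.id-001", "hp", "7"),
    ("item.id-002", "name", "Rapier"),
]

def rows_to_tree(rows):
    """Rebuild the nested table structure the clients expect
    from flat relational rows."""
    tree = {}
    for node_path, key, value in rows:
        node = tree
        for part in node_path.split("."):
            node = node.setdefault(part, {})
        node[key] = value
    return tree
```

The host would run the equivalent of `rows_to_tree` once at load time and then ship the resulting nested tables to the clients, exactly as the in-memory tree is shipped today.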

Actually, that similar to the way that some of my Extensions work anyway - its how I managed to get the Internal Referential Integrity working for ther DOELocations, for eg.

Moon Wizard
February 27th, 2016, 04:06
We'll probably leave the DB as is.

We're heavily weighted against DLLs, because they present challenges with cross-platform support. There are also secure content considerations.

Plus, an RDB doesn't solve this situation anyway, since we'd most likely just be implementing the tree inside the RDB for backward compatibility and flexibility.

Regards,
JPG

dulux-oz
February 27th, 2016, 04:18
Fair enough, I was just wondering.

The issues you mention are not unsolvable (and I'm not sure I agree with you about resolving the searching issue, either :) ), but one of the things that needs to be weighed up is the benefit-vs-cost argument of changing the architecture (or of not changing it, for that matter) - and that's something I'm sure you've done in depth (or will do, if you haven't yet).

A caveat: one of the things I've seen done (by both large organisations and small) is to get caught up in the idea of "sunk costs", where people don't want to go in another direction because they are so heavily invested in the existing one. An example of what I'm talking about is the people who are still struggling along on a Novell network because they didn't want to lose their investment in it when Windows NT came along - in effect they kept on digging the hole they were in instead of starting a new hole, and now they can't get out. I'm not for a moment suggesting that that's what you're doing; I simply mention it as one of the things that people can overlook when considering changes/new tech/whatever. You're obviously not doing this, because you're going to vU.

An interesting discussion, no? :)

damned
February 27th, 2016, 04:30

What's this comparison to Novell?
Novell scaled so much better than Windows. It had a Directory that scaled many factors larger than Windows', a security architecture that made more sense, and better-optimised disk access and read/write performance. However, Novell was never a great application platform, and this is what killed it. Small business didn't want or need both Windows and Novell. Big business did for a long time, but eventually a combination of aggressive sales strategies by MS and a desire to reduce the number of platforms being supported by IS/IT departments (mostly because of the cost of maintaining the skill sets to manage multiple platforms) caused Novell to get dropped. All the things that Novell did well, Windows did well enough.

Windows NT barely made a dent in NetWare sales. The market was growing much faster than the market share was shrinking. AD on 2000+ was when the tide turned. Server 2000 with SQL 7 and Exchange 2000 was offering a Directory Service, a Database Platform and a maturing Groupware product.

dulux-oz
February 27th, 2016, 05:23
And that was the point of the example. (I'm not having a go at Novell or Windows, simply trying to make a point about potentially throwing good money after bad because people can't recognise that the once-good money has gone bad (gone off, if you prefer).) For whatever reason (that's not under discussion), those who stuck with Novell ended up having to change anyway, after even more years of cost supporting Novell. The more cost-effective thing to have done (some argue) was to take the plunge once the competitive platform was ready (W2K in this case) and spend the money ($Z) in year X. Instead, they waited Y years, spending $a per year to continue with the old system ($aY in total), and then had to spend $Z anyway in year X+Y, for a total cost of $(Z+aY). Of course, that doesn't take into account the maintenance costs of the Windows platform during those Y years ($bY) - it's only by a thorough analysis (and usually in hindsight) that you can tell whether $bY < $aY.
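The trade-off can be made concrete with a toy calculation (all numbers invented for illustration): migrating in year X costs $Z up front plus $b per year of new-platform maintenance; delaying Y years costs $a per year on the old system and then $Z anyway.

```python
# Toy numbers (invented) for the cost comparison described above.
Z = 100   # one-off migration cost
a = 30    # yearly cost of keeping the old platform running
b = 10    # yearly maintenance cost of the new platform
Y = 5     # years of delay

migrate_now   = Z + b * Y   # pay Z up front, then b per year for Y years
migrate_later = a * Y + Z   # pay a per year for Y years, then Z anyway
# The difference is (a - b) * Y: delaying is only cheaper if the old
# system costs less per year to run than the new one. Note that the
# original sunk investment ($S) never enters the comparison at all.
```

With these numbers, migrating now totals 150 while delaying totals 250 - the gap is exactly (a - b) * Y, and $S appears in neither figure, which is the sunk-cost point.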

The trouble is that there is a natural human tendency (an incorrect one, in most cases) to ALSO consider the money originally spent on the thing in the first place ($S) (the Novell infrastructure, in this case).

The argument also applies substituting time for money - people have a tendency to stick with something they've spent a lot of time and effort on, even past the point when they should quit/change/whatever, simply because they HAVE spent so much time and effort. There comes a point when you should "cut your losses", but in determining that point you really shouldn't add your existing/previous losses into your calculations, because you may only need a small additional amount of effort ("don't cut your losses too soon") - we were taught that in Economic Management, and in (Military) Officer school, and in a number of other fields where the principle is the same. Sun Tzu even mentions the principle in "The Art of War".

So maybe I should have chosen a better example initially :)