This repository has been archived by the owner on Oct 30, 2018. It is now read-only.

Confused by schema design found in test suite #17

Open

alexanderdean opened this issue Dec 9, 2013 · 1 comment

@alexanderdean
Contributor

I'm confused by the schema design found in the HPaste test suite:

object ExampleSchema extends Schema {

  //There should only be one HBaseConfiguration object per process.  You'll probably want to manage that
  //instance yourself, so this library expects a reference to that instance.  It's implicitly injected into
  //the code, so the most convenient place to put it is right after you declare your Schema.
  implicit val conf = LocalCluster.getTestConfiguration

This approach tightly couples a test Hadoop configuration to the schema object. Obviously this is fine for a project which will never be run on a real cluster, but what's the recommended approach for a schema which will be used "in anger", i.e. one which needs to support both LocalCluster.getTestConfiguration and the real Hadoop cluster's Configuration? (Bearing in mind that implicit values in Scala can't cross object boundaries.)
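
For concreteness, the "real Hadoop cluster's Configuration" being contrasted with the test one would typically come from the standard HBase client API rather than from LocalCluster; a minimal sketch, where the ZooKeeper quorum value is purely illustrative:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration

// HBaseConfiguration.create() loads hbase-site.xml / core-site.xml from
// the classpath; individual settings can also be overridden in code.
val clusterConf: Configuration = {
  val c = HBaseConfiguration.create()
  c.set("hbase.zookeeper.quorum", "zk1.example.com,zk2.example.com,zk3.example.com")
  c
}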

@alexanderdean
Contributor Author

To work around this I've changed the schema definition to a class:

/**
 * Holds our HBase schema for lookups into the
 * API Transactions table.
 */
class ApiTransactionSchema(implicit conf: Configuration) extends Schema {

I then select the appropriate Hadoop Configuration in my calling code:

implicit val conf = // Still to write
val ats = new ApiTransactionSchema
val txn = ats.ApiTransactionTable.query2...

I must be missing something much cleaner though - what do you guys do?
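
One possible way to fill in the "Still to write" line above: a minimal sketch, assuming a hypothetical useLocalTestCluster switch driven by the application's own configuration (LocalCluster.getTestConfiguration is the HPaste test helper quoted at the top of this issue; its import is omitted):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.HBaseConfiguration

// Hypothetical switch: true to run against the in-process test cluster,
// false to run against a real cluster. How it is sourced is up to you;
// a system property is used here purely for illustration.
val useLocalTestCluster: Boolean = sys.props.contains("hpaste.testCluster")

implicit val conf: Configuration =
  if (useLocalTestCluster) {
    LocalCluster.getTestConfiguration   // HPaste test helper quoted above
  } else {
    HBaseConfiguration.create()         // real cluster settings from the classpath
  }

val ats = new ApiTransactionSchema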
