
EstimateAnomalyLikelihoods sometimes fail (unstable) #517

Closed
Zbysekz opened this issue Jun 15, 2019 · 17 comments · Fixed by #541

Comments

@Zbysekz

Zbysekz commented Jun 15, 2019

When I run the tests (just the `pytest` command in the root folder) several times, they sometimes fail.
Below is one log:
============================================================================ test session starts =============================================================================
platform linux -- Python 3.6.7, pytest-4.1.1, py-1.7.0, pluggy-0.8.1 -- /usr/bin/python
cachedir: .pytest_cache
rootdir: /media/Data/Data/HTM/nupic.cpp, inifile: setup.cfg
plugins: mock-1.10.0, cov-2.7.1
collected 188 items

bindings/py/tests/check_test.py::LoadBindingsTest::testImportBindingsExtensions PASSED [ 0%]
bindings/py/tests/check_test.py::LoadBindingsTest::testImportBindingsInstalled PASSED [ 1%]
bindings/py/tests/network_test.py::NetworkTest::testNetworkLinkTypeValidation PASSED [ 1%]
bindings/py/tests/network_test.py::NetworkTest::testParameters SKIPPED [ 2%]
bindings/py/tests/network_test.py::NetworkTest::testSerializationWithPyRegion SKIPPED [ 2%]
bindings/py/tests/network_test.py::NetworkTest::testSimpleTwoRegionNetworkIntrospection PASSED [ 3%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testEquals PASSED [ 3%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testNupicRandomPickling SKIPPED [ 4%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testPlatformSame PASSED [ 4%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSample PASSED [ 5%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSampleAll PASSED [ 5%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSampleBadDtype PASSED [ 6%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSampleNone PASSED [ 6%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSamplePopulationTooSmall PASSED [ 7%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSampleSequenceRaisesTypeError PASSED [ 7%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSampleWrongDimensionsPopulation PASSED [ 8%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testSerialization PASSED [ 9%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testShuffle PASSED [ 9%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testShuffleBadDtype PASSED [ 10%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testShuffleEmpty PASSED [ 10%]
bindings/py/tests/nupic_random_test.py::TestNupicRandom::testShuffleEmpty2 PASSED [ 11%]
bindings/py/tests/pyregion_test.py::PyRegionTest::testCallUnimplementedMethod PASSED [ 11%]
bindings/py/tests/pyregion_test.py::PyRegionTest::testNoInit PASSED [ 12%]
bindings/py/tests/pyregion_test.py::PyRegionTest::testUnimplementedAbstractMethods PASSED [ 12%]
bindings/py/tests/pyregion_test.py::PyRegionTest::testUnimplementedNotImplementedMethods PASSED [ 13%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testDenseToDense SKIPPED [ 13%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testDenseToDenseToDenseDelay SKIPPED [ 14%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testDenseToDenseToSparseDelay SKIPPED [ 14%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testDenseToSparse SKIPPED [ 15%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testSparseToDense SKIPPED [ 15%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testSparseToSparse SKIPPED [ 16%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testSparseToSparseToDenseDelay SKIPPED [ 17%]
bindings/py/tests/sparse_link_test.py::SparseLinkTest::testSparseToSparseToSparseDelay SKIPPED [ 17%]
bindings/py/tests/temporal_memory_test.py::TemporalMemoryBindingsTest::testIssue807 PASSED [ 18%]
bindings/py/tests/topology_test.py::TestTopology::testCoordinatesFromIndex PASSED [ 18%]
bindings/py/tests/topology_test.py::TestTopology::testDefaultTopology PASSED [ 19%]
bindings/py/tests/topology_test.py::TestTopology::testIndexFromCoordinates PASSED [ 19%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodDimensionOne PASSED [ 20%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodInMiddle1D PASSED [ 20%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodOfEnd2D PASSED [ 21%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodOfMiddle2D PASSED [ 21%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodOfOrigin1D PASSED [ 22%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodOfOrigin2D PASSED [ 22%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodOfOrigin3D PASSED [ 23%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodRadiusZero PASSED [ 23%]
bindings/py/tests/topology_test.py::TestTopology::testNeighborhoodWiderThanWorld PASSED [ 24%]
bindings/py/tests/topology_test.py::TestTopology::testNoTopology PASSED [ 25%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodDimensionOne PASSED [ 25%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodInMiddle1D PASSED [ 26%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodOfEnd2D PASSED [ 26%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodOfMiddle2D PASSED [ 27%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodOfOrigin1D PASSED [ 27%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodOfOrigin2D PASSED [ 28%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodOfOrigin3D PASSED [ 28%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodRadiusZero PASSED [ 29%]
bindings/py/tests/topology_test.py::TestTopology::testWrappingNeighborhoodWiderThanWorld PASSED [ 29%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testComputeComplex PASSED [ 30%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testComputeInferOrLearnOnly PASSED [ 30%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testExampleUsage PASSED [ 31%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testInitInvalidParams PASSED [ 31%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testInitialization PASSED [ 32%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testMissingRecords PASSED [ 32%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testMultiStepPredictions PASSED [ 33%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testMultistepSimple PASSED [ 34%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testMultistepSingleValue PASSED [ 34%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testOverlapPattern PASSED [ 35%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testPredictionDistribution PASSED [ 35%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testPredictionDistributionContinuousLearning PASSED [ 36%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testPredictionDistributionOverlap PASSED [ 36%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testPredictionMultipleCategories PASSED [ 37%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testSerialization SKIPPED [ 37%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testSingleValue PASSED [ 38%]
bindings/py/tests/algorithms/sdr_classifier_test.py::ClassifierTest::testSingleValue0Steps PASSED [ 38%]
bindings/py/tests/algorithms/spatial_pooler_test.py::SpatialPoolerTest::testCompute PASSED [ 39%]
bindings/py/tests/algorithms/spatial_pooler_test.py::SpatialPoolerTest::testGetConnectedCountsUint32 PASSED [ 39%]
bindings/py/tests/algorithms/spatial_pooler_test.py::SpatialPoolerTest::testGetConnectedCountsUint64 PASSED [ 40%]
bindings/py/tests/algorithms/spatial_pooler_test.py::SpatialPoolerTest::testGetConnectedSynapsesUint32 PASSED [ 40%]
bindings/py/tests/algorithms/spatial_pooler_test.py::SpatialPoolerTest::testGetConnectedSynapsesUint64 PASSED [ 41%]
bindings/py/tests/algorithms/spatial_pooler_test.py::SpatialPoolerTest::testGetPermanenceFloat32 PASSED [ 42%]
bindings/py/tests/algorithms/spatial_pooler_test.py::SpatialPoolerTest::testGetPermanenceFloat64 PASSED [ 42%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testAverageOverlap PASSED [ 43%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testConstructor PASSED [ 43%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testDeterminism PASSED [ 44%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testErrorChecks PASSED [ 44%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testPickle SKIPPED [ 45%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testRadiusResolution PASSED [ 45%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testRandomOverlap PASSED [ 46%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testSeed PASSED [ 46%]
bindings/py/tests/encoders/rdse_test.py::RDSE_Test::testSparsityActiveBits PASSED [ 47%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testBadEncode PASSED [ 47%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testBadParameters PASSED [ 48%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testCategories PASSED [ 48%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testClipInput PASSED [ 49%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testConstructor PASSED [ 50%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testEncode PASSED [ 50%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testNaNs PASSED [ 51%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testPeriodic PASSED [ 51%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testPickle SKIPPED [ 52%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testRadius PASSED [ 52%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testResolution PASSED [ 53%]
bindings/py/tests/encoders/scalar_encoder_test.py::ScalarEncoder_Test::testStatistics PASSED [ 53%]
bindings/py/tests/sdr/Metrics_test.py::MetricsTest::testAF_Example PASSED [ 54%]
bindings/py/tests/sdr/Metrics_test.py::MetricsTest::testAF_initializeToValue PASSED [ 54%]
bindings/py/tests/sdr/Metrics_test.py::MetricsTest::testMetricsExample PASSED [ 55%]
bindings/py/tests/sdr/Metrics_test.py::MetricsTest::testOverlapExample PASSED [ 55%]
bindings/py/tests/sdr/Metrics_test.py::MetricsTest::testReset PASSED [ 56%]
bindings/py/tests/sdr/Metrics_test.py::MetricsTest::testSparsityConstructor PASSED [ 56%]
bindings/py/tests/sdr/Metrics_test.py::MetricsTest::testSparsityExample PASSED [ 57%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testAddNoise PASSED [ 57%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testConstructor PASSED [ 58%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testCoordinates PASSED [ 59%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testDense PASSED [ 59%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testDenseInplace PASSED [ 60%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testExampleUsage PASSED [ 60%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testGetOverlap PASSED [ 61%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testGetSparsity PASSED [ 61%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testGetSum PASSED [ 62%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testKeepAlive PASSED [ 62%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testKillCells PASSED [ 63%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testPickle SKIPPED [ 63%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testRandomRNG PASSED [ 64%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testRandomizeEqNe PASSED [ 64%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testRandomizeReturn PASSED [ 65%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testReshape PASSED [ 65%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testSetSDR PASSED [ 66%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testSparse PASSED [ 67%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testStr PASSED [ 67%]
bindings/py/tests/sdr/SDR_test.py::SdrTest::testZero PASSED [ 68%]
bindings/py/tests/sdr/SDR_test.py::IntersectionTest::testExampleUsage PASSED [ 68%]
bindings/py/tests/sdr/SDR_test.py::IntersectionTest::testInPlace PASSED [ 69%]
bindings/py/tests/sdr/SDR_test.py::IntersectionTest::testReturn PASSED [ 69%]
bindings/py/tests/sdr/SDR_test.py::IntersectionTest::testSparsity PASSED [ 70%]
bindings/py/tests/sdr/SDR_test.py::UnionTest::testExampleUsage PASSED [ 70%]
bindings/py/tests/sdr/SDR_test.py::UnionTest::testInPlace PASSED [ 71%]
bindings/py/tests/sdr/SDR_test.py::UnionTest::testReturn PASSED [ 71%]
bindings/py/tests/sdr/SDR_test.py::UnionTest::testSparsity PASSED [ 72%]
bindings/py/tests/sdr/SDR_test.py::ConcatenationTest::testConstructorErrors PASSED [ 72%]
bindings/py/tests/sdr/SDR_test.py::ConcatenationTest::testExampleUsage PASSED [ 73%]
bindings/py/tests/sdr/SDR_test.py::ConcatenationTest::testMirroring PASSED [ 73%]
bindings/py/tests/sdr/SDR_test.py::ConcatenationTest::testReturn PASSED [ 74%]
bindings/py/tests/sdr/SDR_test.py::ConcatenationTest::testVersusNumpy PASSED [ 75%]
py/tests/utils_test.py::UtilsTest::testEquals PASSED [ 75%]
py/tests/utils_test.py::UtilsTest::testMovingAverage PASSED [ 76%]
py/tests/utils_test.py::UtilsTest::testMovingAverageInstance PASSED [ 76%]
py/tests/utils_test.py::UtilsTest::testMovingAverageReadWrite SKIPPED [ 77%]
py/tests/utils_test.py::UtilsTest::testMovingAverageSlidingWindowInit PASSED [ 77%]
py/tests/utils_test.py::UtilsTest::testSerialization PASSED [ 78%]
py/tests/algorithms/anomaly_likelihood_jeff_test.py::ArtificialAnomalyTest::testCaseContinuousBunchesOfSpikes PASSED [ 78%]
py/tests/algorithms/anomaly_likelihood_jeff_test.py::ArtificialAnomalyTest::testCaseIncreasedAnomalyScore PASSED [ 79%]
py/tests/algorithms/anomaly_likelihood_jeff_test.py::ArtificialAnomalyTest::testCaseIncreasedSpikeFrequency PASSED [ 79%]
py/tests/algorithms/anomaly_likelihood_jeff_test.py::ArtificialAnomalyTest::testCaseMissingBunchesOfSpikes SKIPPED [ 80%]
py/tests/algorithms/anomaly_likelihood_jeff_test.py::ArtificialAnomalyTest::testCaseMissingSpike SKIPPED [ 80%]
py/tests/algorithms/anomaly_likelihood_jeff_test.py::ArtificialAnomalyTest::testCaseSingleSpike PASSED [ 81%]
py/tests/algorithms/anomaly_likelihood_jeff_test.py::ArtificialAnomalyTest::testCaseUnusuallyHighSpikeFrequency PASSED [ 81%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodClassTest::testAnomalyProbabilityResultsDuringProbationaryPeriod PASSED [ 82%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodClassTest::testCalcSkipRecords PASSED [ 82%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodClassTest::testEquals PASSED [ 83%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodClassTest::testHistoricWindowSize PASSED [ 84%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodClassTest::testReestimationPeriodArg PASSED [ 84%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodClassTest::testSerialization PASSED [ 85%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodClassTest::testdWindowSizeImpactOnEstimateAnomalyLikelihoodsArgs PASSED [ 85%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testBadParams PASSED [ 86%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testEstimateAnomalyLikelihoods PASSED [ 86%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testEstimateAnomalyLikelihoodsCategoryValues PASSED [ 87%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testEstimateAnomalyLikelihoodsMalformedRecords PASSED [ 87%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testEstimateNormal PASSED [ 88%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testFilterLikelihodsInputType PASSED [ 88%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testFilterLikelihoods PASSED [ 89%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testFlatAnomalyScores PASSED [ 89%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testFlatMetricScores PASSED [ 90%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testNormalProbability PASSED [ 90%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testSampleDistribution PASSED [ 91%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testSkipRecords PASSED [ 92%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testUpdateAnomalyLikelihoods PASSED [ 92%]
py/tests/algorithms/anomaly_likelihood_test.py::AnomalyLikelihoodAlgorithmTest::testVeryFewScores FAILED [ 93%]
py/tests/encoders/date_test.py::DateEncoderTest::testDateEncoder PASSED [ 93%]
py/tests/encoders/date_test.py::DateEncoderTest::testDayOfWeek PASSED [ 94%]
py/tests/encoders/date_test.py::DateEncoderTest::testHoliday PASSED [ 94%]
py/tests/encoders/date_test.py::DateEncoderTest::testHolidayMultiple PASSED [ 95%]
py/tests/encoders/date_test.py::DateEncoderTest::testMissingValues PASSED [ 95%]
py/tests/encoders/date_test.py::DateEncoderTest::testSeason PASSED [ 96%]
py/tests/encoders/date_test.py::DateEncoderTest::testTime PASSED [ 96%]
py/tests/encoders/date_test.py::DateEncoderTest::testWeekend PASSED [ 97%]
py/tests/encoders/date_test.py::DateEncoderTest::testYearsDiffer SKIPPED [ 97%]
py/tests/encoders/grid_cell_test.py::GridCellEncoder_Test::testDeterminism PASSED [ 98%]
py/tests/encoders/grid_cell_test.py::GridCellEncoder_Test::testNan PASSED [ 98%]
py/tests/encoders/grid_cell_test.py::GridCellEncoder_Test::testSeed PASSED [ 99%]
py/tests/encoders/grid_cell_test.py::GridCellEncoder_Test::testStatistics PASSED [100%]
Coverage.py warning: Module /bindings/py/tests/ was never imported. (module-not-imported)

================================================================================== FAILURES ==================================================================================
______________________________________________________________ AnomalyLikelihoodAlgorithmTest.testVeryFewScores ______________________________________________________________

self = <anomaly_likelihood_test.AnomalyLikelihoodAlgorithmTest testMethod=testVeryFewScores>

def testVeryFewScores(self):
  """
  This calls estimateAnomalyLikelihoods and updateAnomalyLikelihoods
  with one or no scores.
  """

  # Generate an estimate using two data points
  data1 = _generateSampleData(mean=42.0, variance=1e-10)

  _, _, estimatorParams = (
    an.estimateAnomalyLikelihoods(data1[0:2])
  )

  self.assertTrue(an.isValidEstimatorParams(estimatorParams))

  # Check that the estimated mean is that value
  dParams = estimatorParams["distribution"]
self.assertWithinEpsilon(dParams["mean"], data1[0][2])

/media/Data/Data/HTM/nupic.cpp/py/tests/algorithms/anomaly_likelihood_test.py:715:


/media/Data/Data/HTM/nupic.cpp/py/tests/algorithms/anomaly_likelihood_test.py:336: in assertWithinEpsilon
"Values %g and %g are not within %g" % (a, b, epsilon))
E AssertionError: 41.50000323889616 not less than or equal to 0.005 : Values 0.5 and 42 are not within 0.005
------------------------------------------------- generated xml file: /media/Data/Data/HTM/nupic.cpp/junit-test-results.xml --------------------------------------------------

----------- coverage: platform linux, python 3.6.7-final-0 -----------
Coverage HTML written to dir htmlcov

============================================================== 1 failed, 168 passed, 19 skipped in 2.28 seconds ==============================================================

@Zbysekz
Author

Zbysekz commented Jun 15, 2019

And again, with a different assertion error in the same test suite:
================================================================================== FAILURES ==================================================================================
_______________________________________________________ AnomalyLikelihoodAlgorithmTest.testEstimateAnomalyLikelihoods ________________________________________________________

self = <anomaly_likelihood_test.AnomalyLikelihoodAlgorithmTest testMethod=testEstimateAnomalyLikelihoods>

def testEstimateAnomalyLikelihoods(self):
  """
  This calls estimateAnomalyLikelihoods to estimate the distribution on fake
  data and validates the results
  """

  # Generate an estimate using fake distribution of anomaly scores.
  data1 = _generateSampleData(mean=0.2)

  likelihoods, avgRecordList, estimatorParams = (
    an.estimateAnomalyLikelihoods(data1[0:1000])
  )
  self.assertEqual(len(likelihoods), 1000)
  self.assertEqual(len(avgRecordList), 1000)
  self.assertTrue(an.isValidEstimatorParams(estimatorParams))

  # Check that the sum is correct
  avgParams = estimatorParams["movingAverage"]
  total = 0
  for v in avgRecordList:
    total = total + v[2]
  self.assertTrue(avgParams["total"], total)

  # Check that the estimated mean is correct
  dParams = estimatorParams["distribution"]
  self.assertWithinEpsilon(dParams["mean"],
                           total / float(len(avgRecordList)))

  # Number of points with lower than 2% probability should be pretty low
  # but not zero. Can't use exact 2% here due to random variations
self.assertLessEqual(numpy.sum(likelihoods < 0.02), 50)

E AssertionError: 52 not less than or equal to 50

/media/Data/Data/HTM/nupic.cpp/py/tests/algorithms/anomaly_likelihood_test.py:458: AssertionError
------------------------------------------------- generated xml file: /media/Data/Data/HTM/nupic.cpp/junit-test-results.xml --------------------------------------------------

----------- coverage: platform linux, python 3.6.7-final-0 -----------
Coverage HTML written to dir htmlcov

============================================================== 1 failed, 168 passed, 19 skipped in 2.34 seconds ==============================================================

@Zbysekz
Author

Zbysekz commented Jun 15, 2019

I just want to point this out; I'm not sure whether this is known or connected to some unfinished work...

@dkeeney

dkeeney commented Jun 15, 2019

Yes, I hit the same kind of instability in PR #518.
My errors were different, but it was the same test.

@ctrl-z-9000-times
Collaborator

Yes, I've seen this test fail quite a lot.

@breznak
Copy link
Member

breznak commented Jun 17, 2019

self.assertLessEqual(numpy.sum(likelihoods < 0.02), 50)

So we could relax this test a bit: either use an even smaller probability (`< 0.015`) or increase the allowed count (to `60`)?

@Zbysekz
Author

Zbysekz commented Jun 17, 2019

Do you assume that this is not a bug? I think the proper way to handle this would be:

  • Make anomaly_likelihood_test.py deterministic (import the random class from C++, as is done in grid_cell_encoder.py)

  • Figure out whether this is correct behaviour

I have already started working on the first point... :)
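The first point can be sketched roughly like this (a minimal illustration, assuming the sample generator takes a seed parameter; `generate_sample_data` and `GLOBAL_TEST_SEED` here are hypothetical names, not the actual test helpers):

```python
import numpy as np

GLOBAL_TEST_SEED = 42  # one knob that changes every seed in the test suite

def generate_sample_data(mean, variance=0.01, n=1000, seed=GLOBAL_TEST_SEED):
    """Hypothetical stand-in for the test's _generateSampleData helper."""
    rng = np.random.RandomState(seed)  # fixed seed -> reproducible scores
    return rng.normal(mean, np.sqrt(variance), size=n)

a = generate_sample_data(mean=0.2)
b = generate_sample_data(mean=0.2)
assert np.array_equal(a, b)  # identical on every run with the same seed
```

With a fixed seed the assertions in the tests see the same data on every run, so any remaining failure is a real bug rather than sampling noise.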

Zbysekz added a commit to Zbysekz/htm.core that referenced this issue Jun 17, 2019
@breznak
Member

breznak commented Jun 18, 2019

Do you assume, that this is not a bug?

Hey, we do not entirely trust the Likelihood implementations (C++, Python); the code has "just" been ported and very much needs a proper code review (maybe a rewrite), validation, and testing through trial/use-cases.
See #469

data1 = data1 + (_generateSampleData(mean=0.9,seed=rnd.getUInt32())[0:200])

the patch looks good 👍
Just a note on the `seed=` above: use a positive integer to get a fixed random sequence (mostly used for tests), or `0` to get a randomly seeded sequence, similar to what you get in the code above.

Looking forward to your findings with likelihood!

@Zbysekz
Author

Zbysekz commented Jun 18, 2019

OK, I understand; we should test it and do as you say.

About `seed=` and putting just a positive integer there:
do you suggest this just for line 520, or also for the rest?
Because right now I can change all the seeds just by changing the GLOBAL_TEST_SEED...

@breznak
Member

breznak commented Jun 18, 2019

you suggest this just for line 520 or also for the rest?
Because now i a can change all seeds just by changing the GLOBAL_TEST_SEED...

For all seeds/random instances; the global test seed would work fine there.
But I was mistaken: the seed I was talking about is for the C++ nupic::Random (which can be used from Python), but you're using rng (= RandomState), which is used for the other functions (normal, ...).

Can you open a PR to make commenting easier?

@Zbysekz
Author

Zbysekz commented Jun 19, 2019

Unstable tests:

  • testVeryFewScores()
  • testEstimateAnomalyLikelihoods()
  • testSkipRecords()

All of these test estimateAnomalyLikelihoods().

@Zbysekz
Author

Zbysekz commented Jun 19, 2019

I got it! For testVeryFewScores(), it is because of this "HACK":

https://github.com/htm-community/nupic.cpp/blob/efce15f87c644021ec59f76d2d1044babd323da7/py/htm/algorithms/anomaly_likelihood.py#L365-L378

In short: estimateAnomalyLikelihoods() is called from testVeryFewScores() with just two anomaly scores. The variance is then sometimes lower than 1.5e-5, so nullDistribution is returned.
But nullDistribution has its mean permanently set to 0.5, which is why the assertion throws at
https://github.com/htm-community/nupic.cpp/blob/efce15f87c644021ec59f76d2d1044babd323da7/py/tests/algorithms/anomaly_likelihood_test.py#L711
since the mean of the test data is around 42.
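The mechanism can be illustrated with a small sketch (hypothetical names; `VARIANCE_FLOOR` and `estimate_distribution` only mimic the fallback logic linked above, they are not the real anomaly_likelihood.py code):

```python
import numpy as np

VARIANCE_FLOOR = 1.5e-5  # the threshold described above

def estimate_distribution(scores):
    """Return a distribution estimate, falling back to a fixed 'null'
    distribution when the sample variance is below the floor."""
    mean = float(np.mean(scores))
    var = float(np.var(scores))
    if var < VARIANCE_FLOOR:
        # The fallback discards the real mean and pins it at 0.5,
        # which is exactly what makes the test's mean check fail.
        return {"mean": 0.5, "variance": 1e6}
    return {"mean": mean, "variance": var}

# Two nearly identical samples around 42 trip the floor:
print(estimate_distribution([42.0, 42.00000001])["mean"])  # 0.5, not ~42
```

So with only two almost-equal scores the estimated mean has nothing to do with the input data, and any assertion comparing it against ~42 fails whenever the variance happens to dip under the floor.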

@Zbysekz
Author

Zbysekz commented Jun 26, 2019

For testEstimateAnomalyLikelihoods() it is simply due to variations in the random data combined with these fixed thresholds:
https://github.com/htm-community/nupic.cpp/blob/69555d7396d05c4af1e02199141d047963e3ccfc/py/tests/algorithms/anomaly_likelihood_test.py#L452-L454

Here are plots for 100 different seeds

[image: plot]

and for 1000 different seeds

[image: plot]

@Zbysekz
Author

Zbysekz commented Jun 26, 2019

For testSkipRecords() it is also due to variations in a small amount of random data:
https://github.com/htm-community/nupic.cpp/blob/69555d7396d05c4af1e02199141d047963e3ccfc/py/tests/algorithms/anomaly_likelihood_test.py#L517-L519

Here are plots for 100 different seeds

[image: plot]

and for 1000 different seeds

[image: plot]

@Zbysekz
Author

Zbysekz commented Jun 26, 2019

So that's enough playing around, I think. It behaves as expected.
We now have two options: either adjust the thresholds, or leave it as is, since we use seed=1 and all tests pass. The second option has one catch: the random generator might be platform-specific...

What do you think?

@breznak
Member

breznak commented Jun 26, 2019

either adjust thresholds or just let it as is because we use seed=1 and all tests are passing. The second has one catch, because random might be platform specific...

I'd think the adjusted thresholds (decrease the likelihood cutoff from 0.02? increase the allowed number of samples under that threshold from 50?) would be better, as it's not a "hack" for only a given seed. Unless the values would have to become ridiculous/useless just to pass the tests.

@ctrl-z-9000-times
Collaborator

That's a good analysis.

I think the next steps in analyzing this data would be to:

  1. Use a histogram instead of a line plot, to verify that the data falls in a normal distribution.
  2. Find the mean & standard deviation of the data.
  3. Set the test thresholds to the (1 / 1000)'th percentile of the data.

Or, of course, you can use a constant seed. A constant seed won't randomly fail; IIRC many of these unit tests use a constant seed.
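Step 3 could be sketched like this (illustrative only; the `counts` array stands in for the per-seed statistics plotted earlier and is simulated here):

```python
import numpy as np

# Simulated per-seed values of the tested statistic (hypothetical data;
# in practice you would collect these by rerunning the test over many seeds).
rng = np.random.default_rng(0)
counts = rng.normal(loc=40, scale=5, size=1000)

# Pick the threshold at the 99.9th percentile of the empirical data,
# so only about 1 in 1000 random seeds would exceed it.
threshold = np.percentile(counts, 99.9)
exceed = int(np.sum(counts > threshold))
print(threshold, exceed)  # with 1000 samples, typically exactly 1 exceeds
```

Setting the assertion threshold this way ties the expected failure rate to a chosen percentile instead of a hand-tuned constant like 50.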

@ctrl-z-9000-times
Collaborator

Fixed by PR #525.

Thanks @Zbysekz for working on this issue! I hope you can keep using and contributing to this project.
