These are relevant issues. However, for the time being, we have
restricted ourselves to data that reflect averages with a minimum
duration of one year. This is because these are the data typically
used, and allowing finer granularity would risk overcomplicating the
model relative to the data normally in use in the domain.
Nevertheless, I believe that in the future it will be relevant to
allow more flexibility here, and I think that will also be possible
without actually changing the ontology. The current restriction is
not ontological, just practical.
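To illustrate what I mean by a practical rather than ontological
restriction, here is a minimal sketch (plain Python; all names are
hypothetical and not taken from the ontology) where the one-year
minimum lives in a validation rule on the data, not in the model
itself:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class AveragedFlow:
        # A quantity expressed as an average over an explicit time interval.
        value: float   # e.g. an average flow rate
        unit: str      # e.g. "kg/year"
        start: date    # start of the averaging period
        end: date      # end of the averaging period

        def meets_minimum_duration(self, min_days: int = 365) -> bool:
            # The practical restriction: the averaging period spans at
            # least one year. Relaxing this later would not change the
            # class (the "ontology"), only this check.
            return (self.end - self.start).days >= min_days

    flow = AveragedFlow(value=12.5, unit="kg/year",
                        start=date(2018, 1, 1), end=date(2019, 1, 1))
    assert flow.meets_minimum_duration()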
On 2019-03-18 at 05:24, mmremolona via Groups.Io wrote:
Sorry for not participating as much the past few weeks. I'm trying
to catch up with what everyone has said so far.
In terms of these competency questions, I guess the question that
Massimo is asking is with respect to time scales and time windows.
I'm not entirely familiar with the datasets available in the domain,
but these time scales for measurements can cause some incongruity in
the representation that finally ends up in the ontologies. I'm not
sure whether the questions I ask are of the type to be included in
these competency questions, but my questions are as follows:
(MR_Q1) What is the time granularity of the data that we acquire?
This includes flow rates and production statistics. I also assume
this varies between the different sources of data. Some data may
already be averaged (do we handle these differently?).
(MR_Q2) Are we going to aggregate data as part of the ontology
specification, or is this left for other parts of the pipeline? And
if we are to aggregate data, to what degree and at what time scales?
(Per hour, per day, per week? I think this depends on how often we
aggregate data and what data is available. I don't think per-minute
data is significant in the overall scheme of LCA, but I might be
wrong; the sketch after these questions is what I have in mind.)
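To make both questions concrete, here is a minimal sketch (Python
with pandas; the series, values, and field names are hypothetical,
not from any agreed dataset) of handling granularity and aggregation
in the pipeline rather than in the ontology:

    import pandas as pd

    # Hypothetical hourly flow-rate measurements for a single exchange.
    idx = pd.date_range("2018-01-01", periods=24 * 365, freq="h")
    hourly = pd.Series([1.0] * len(idx), index=idx)

    # MR_Q1: infer the granularity from the timestamps themselves
    # rather than trusting the source's description of itself.
    print(pd.infer_freq(idx))  # hourly ("h"/"H", depending on version)

    # Pre-averaged data could carry an explicit flag so that it is
    # not averaged a second time downstream.
    record = {"value": 4.2, "unit": "kg/h", "already_averaged": True}

    # MR_Q2: aggregation done in the pipeline, not in the ontology;
    # reduce the hourly series to the yearly average that the one-year
    # minimum discussed above expects.
    yearly_mean = hourly.groupby(hourly.index.year).mean()
    print(yearly_mean)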
As of now, these are the questions that came to mind as I read
through the threads in this group. I'll post more ideas as I come
across them.