All speeches
Localising the User-Centricity Principles
Last update 17 Sep 2021
82 paragraphs, 93 comments
Localising the User-Centricity Principles (IT)
Last update 17 Sep 2021
91 paragraphs, 26 comments
Localising the User-Centricity Principles (NL)
Last update 17 Sep 2021
89 paragraphs, 10 comments
Localising the User-Centricity Principles (FR)
Last update 17 Sep 2021
92 paragraphs, 0 comments
Localising the User-Centricity Principles (ES)
Last update 17 Sep 2021
91 paragraphs, 0 comments
Localising the User-Centricity Principles (FIN)
Last update 17 Sep 2021
91 paragraphs, 0 comments
Localising the User-Centricity Principles (DE)
Last update 17 Sep 2021
91 paragraphs, 0 comments
List of indicators for UserCentriCities dashboard
Last update 20 Sep 2021
67 paragraphs, 132 comments
Indicators list - second iteration
Last update 24 Nov 2021
79 paragraphs, 85 comments
Speech text
1. ENABLERS

1.1 Skills
* The number of professionals with user experience / user research / service design skills
* The number of ICT specialists employed by the local authority
* The share of citizens with at least basic digital skills at local level (or national level, based on data availability)
* Number of civil servants that received training in ICT
* Total number of employees of the local authority
* Any additional indicators (and sources)?

1.2 Incentives
* Does the local authority offer incentives for citizens to use digital services (such as lower costs for online)?
* Does the local authority provide onboarding of citizens in digital services?
* The number of citizens that received training on digital public service use
* Any additional indicators (and sources)?

1.3 Ecosystem
* The number of APIs provided by the local authority to other administrations and to private companies
* The monthly number of calls to APIs developed by the local authority
* Number of standardised service modules in use, provided by higher institutional bodies (e.g. national payment service or ID)
* Any additional indicators (and sources)?

1.4 Participation
* Has the local authority adopted service co-design / user research practices?
* The number of citizens that have engaged in service co-design practices in the past year
* (alternative indicator) The share of citizens that have engaged in service co-design practices in the past year
* The number of digital participation initiatives offered by the local authority to citizens to engage them in the creation of new policies
* Any additional indicators (and sources)?

2. USER-CENTRICITY PERFORMANCE

2.1 Supply of services
* The share of services available online out of total services provided that could potentially be digitalised
* Do the online services provided align with the requirements of the Single Digital Gateway?
* The share of online services that offer the possibility to check their online status (e.g. active or inactive)
* Any additional indicators (and sources)?

2.2 Usability
* Are the local authority web services in line with WCAG guidelines?
* Are the online services developed through constant iteration and changed based on the behaviour of citizens?
* The share of digital services that are fully in line with WCAG guidelines
* Has the local authority put in place means for users to provide feedback?
* Does the authority have simple language standards?
* Does the authority have service standards in place?
* Any additional indicators (and sources)?

2.3 Security and privacy
* Are users able to use eID, recognised under the eIDAS directive, as a means of authentication for all online services requiring authentication?
* The share of citizens with an eIDAS-compliant eID at local level (e.g. city, municipality, administrative region etc.)
* Has the local authority put in place channels through which citizens can access and ask for the correction and deletion of their personal data?
* Any additional indicators (and sources)?

2.4 Redress mechanisms
* Does the local authority provide online redress mechanisms for both citizens and businesses?
* Any additional indicators (and sources)?

3. OUTCOME

3.1 Adoption
* The share of transactions executed online out of total transactions (online and offline) executed
* The share of citizens possessing an eID that have made use of it at least once
* The share of population by specific segments
* Does the local authority publish adoption data in real time?
* The three most popular services and their adoption rates
* The share of online transactions completed out of the total transactions attempted (the completion rate; see the illustrative sketch after this list)
* Any additional indicators (and sources)?

3.2 Reduction of administrative burden
* Does the local authority provide pre-filled forms?
* Does the local authority offer at least one proactive service?
* The average time saved by citizens when using an online service compared to the offline one
* The estimated amount of annual financial savings for the public administration
* Any additional indicators (and sources)?

3.3 Satisfaction
* Does the local authority measure the citizens' level of satisfaction with regard to the services' provision?
* The share of satisfied users
* Any additional indicators (and sources)?

3.4 Impact
* Annual estimate of the volume of CO2 saved by citizens when using an online service compared to the offline one
* Any additional indicators (and sources)?
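Many of the indicators above are defined as shares or rates. The snippet below is a purely illustrative sketch, not part of the consultation text: it shows how two of them, the share of services available online and the completion rate, could be computed for a dashboard. All counts and names are hypothetical examples.

```python
# Illustrative sketch only: computing "share"-type indicators from the list above.
# All figures are hypothetical.

def share(part: int, whole: int) -> float:
    """Return a percentage, guarding against division by zero."""
    return round(100 * part / whole, 1) if whole else 0.0

# Hypothetical counts for one local authority
services_online = 42            # services available online
services_digitalisable = 60     # services that could potentially be digitalised
transactions_completed = 8_150  # online transactions completed
transactions_attempted = 9_400  # online transactions attempted

indicators = {
    "supply_of_services_pct": share(services_online, services_digitalisable),
    "completion_rate_pct": share(transactions_completed, transactions_attempted),
}

print(indicators)
# {'supply_of_services_pct': 70.0, 'completion_rate_pct': 86.7}
```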
Comments
The number can also be misleading since cities outsource ICT services; the number of specialists on the city payroll doesn't give the whole picture of the capabilities the city has at its disposal.
"Basic skills" needs to be defined clearly so that the data is comparable. This theme has been examined in national-level surveys, and occasionally at local level, but it can be troublesome to do.
What is meant by training needs to be defined clearly.
Agreed, this should be background information (such as the number of citizens) about the cities provided on the platform/dashboard.
Related to skills is also whether employees have the skills to give digital support to customers (number of..?). Language skills have also proven relevant when giving guidance/assistance on the use of digital services.
It needs to be clarified what is included in incentives.
Training provided by the local authority? What is counted as training (IT education given in schools?)? Digital service use training and support is given not only by the cities but by the local ecosystem. Also, services should be so easy to use that they don't require training; what is measured with this indicator?
This is a good indicator to measure the capability to work with the ecosystem. Paola's point is relevant though: background information about the size and service portfolio of different PAs needs to be given if there are number-based indicators.
Developed or provided by? Cities may not develop the APIs while they do provide them. The share of APIs connecting to companies / to the public sector / to the third sector.
Co-design practices need to be defined to ensure comparability. At least in Espoo this can be troublesome to capture accurately, since the practices, and hence the information, are scattered across all departments.
Is the focus here especially on digital participation? The previous indicator is about service co-design and this one is about policies. Should/could it cover both services and policies if the difference is about participation online/offline?
These kinds of standard-based indicators are good.
What is meant by the status of the service? The status of a customer's single service process (service ticket), or the status of the service in a bigger picture, such as queue status or whether it is open or not? Or active/inactive status from an ICT perspective?
This is problematic phrasing of an important question. First, there are two different actions in one indicator; do both need to be yes? What is considered "constant" in the world of developing public services? Could it be more about a systematic approach to utilising the feedback: "Are the online services developed through systematic interaction with the customers and the feedback gathered?"
What standards are considered here? The ones PAs have made for themselves, for example for the time to handle feedback, etc.?
Has the local authority put in place risk mitigation and risk management processes, and governance in case of misuse of or attacks on digital customer services? Does the local authority have a communication plan in case the digital customer services are unavailable or have to be shut down?
This can be difficult to get (at city level) and requires manual labor; data needs to be gathered from many different systems and/or Excel files used by service units.
This seems a bit irrelevant in the Finnish municipality context; the data comes from the national level.
What would the specific segments be? Demographic segments won't tell much about the adoption of, or capability to use, online services. Others can be difficult to define and measure in a comparable manner.
What does this mean?
Adoption rate in comparison to what? The share of potential users? What does real-time publishing indicate?
Proactive service needs to be defined to ensure comparability.
And how is it measured? That is also relevant to the next indicator, to know how comparable the numbers are.
Impact evaluation should have more of a systemic approach. The other aspects of sustainability (social, economic, cultural) should be considered as well.
A couple of concluding comments from Espoo: - All the terms and concepts need to be clearly defined and spelled out; comparability requires that we measure the same things and understand them the same way. - Less is more, especially at the beginning: it's better to have a few clear indicators that we can get good data for, and a few "basic" indicators can also lower the threshold for other cities to join. - Indicators that are based on EU standards such as WCAG or the Single Digital Gateway are good. - Getting the data for some of the indicators presented here requires manual labor and isn't simple when systems are siloed (as they often are in cities).
We should add an indicator regarding educational training for citizens. The level of their digital skills is a fundamental enabler and shall be nurtured.
Another incentive is the carbon efficiency, to be measured, of using a digital service instead of accessing it on premises.
Great indicator, perhaps to be seen as an enabler and not an incentive.
We should add an indicator on the number/share of services available on mobile / in a mobile app.
It is sufficient to publish the adoption rate on a regular basis (monthly, for example); real-time publishing might be an effort that doesn't necessarily pay back.
I would move it under impact. In the end, one of the significant impacts for citizens is the time saved.
Also referring to the questions/suggestions above: what will be the (objective) indicator? Who counts as a "professional with skills"? Will I, with no diplomas, just experience? Do we need a questionnaire to assess these things?
I would suggest as an indicator: the percentage of the annual budget for internal and external expertise booked on IT projects?
I think there are baselines/data for this, at least nationally.
Look into the issue of how to make this objective. For instance, in some countries teachers are counted as employees, in other countries they are not.
Is user-centricity as a principle specifically mentioned in the public service/IT strategy?
I think this is tricky because it matters how cities approach this. I don't think we (can?) give financial incentives, but I would look at the "gains" in a broader perspective: first time right, waiting times, speed.
Should and can it be measured to what extent the ecosystem is "open (source)": the use of open source / open standards?
Maybe make this a yes/no indicator: does a local government have at least one certified/educated service designer? (Or am I talking nonsense now?)
The number of public publications on participatory design activities?
Cool, I guess?
The number of domains a local government uses to provide online services: easy to find?
Yes, and they should, I guess?
Maybe as an indicator the level of audited services (objective measures).
Assess whether there is capacity for iteration? How much budget relative to the annual (IT) budget?
SUS assessment / NPS for online services measured?
NPS/SUS scores should be used?
Would love this!
For now too complicated, but it could be reasoned that better (online) services for social support could lead to earlier signalling and intervention, thus leading to less damage to society (SROI). This could be the case for public health and prevention strategies, debt support, mental health strategies, children's and young people's strategies, and the employment and social welfare systems in place.
Measuring in numbers may be misleading, as it depends on the size of the local authority. Maybe a yes/no indicator related to the involvement of such professionals in the process of designing services would be more significant.
Also in this case the number can be misleading, as it depends on the size of the local PA.
Is this a responsibility of the local PA? Good digital services should be those that can be used also with a low level of digital skills.
Maybe it could be more significant if related to local PA officers.
This one shouldn't be an indicator, as it is not related to the quality of the provided services.
Maybe the number of initiatives for improving digital competencies.
An indicator based on a number is still problematic, as it depends on the size of the local PA and the complexity of the offered services.
Also in this case the indicator depends on the size of the local PA. We should think of indicators that can show excellence also for small local PAs.
Again, the number depends on the specific national implementation. We could change it to yes/no for each of the modules that are key for service digitalisation.
eIDAS is a Regulation, not a directive.
This indicator should be related to the services that have both an online and an offline channel.
This indicator is not 100% clear.
This one doesn't sound like an indicator.
I would hesitate to recommend lower costs or similar as an incentive for people to use digital services, as this can quickly become discriminatory. Further, in the case of Helsinki, mandatory services provided by the city must be equally accessible to everyone, whether or not they are able or willing to access them digitally.
Recommend adding that citizens are able to request their data in a commonly used and machine-readable format (cf. GDPR art. 20).
A major issue with HR resources is not only the number but the quality of ICT and other relevant specialists. Cities have a challenge in finding talented people and are competing for them against industry. Yet outsourcing does not help, since even consulting companies struggle to provide highly skilled specialists for some tasks.
A numerical indicator could, for example, be the number of professionals with a certain level of degree (Bachelor's, Master's, higher).
Is an attribute-based data policy and access control applied to ensure authorised data usage? Is data usage monitored and logged? Can the registry keeper verify who has accessed which data?
Also: is their training being kept up to date? Any kind of degree in ICT is rapidly outdated. The frequency of refresher training is also a good indicator.
How do we measure basic digital skills?
Instead of "number", "share" or "percentage" would be more accurate.
The budget allocated to new ICT training (to keep up with new software, new processes, new services...).
Other incentives: faster (no need to physically go to the premises), more inclusive (for example, for people who don't have conventional schedules that let them go to their town hall from 9:00 to 17:00), real-time updates on the status of their request...
Incentives can also be on the side of employees; it can be beneficial for them.
If so, how? How long? What kind of onboarding? Who provides it? How can we ask for it?
The rate of citizens that want training and actually get it. Depending on capacity, there might be a discrepancy between the willingness of citizens to receive training and the capacity of civil servants to provide it.
Communication campaigns made to incentivise the use of digital services? Money put into communication?
The number of bugs that have been reported/fixed, and in how much time.
I think it would also be beneficial to know the share of citizens that use the services regularly. If there is a high number of recent engagements because the service was launched last year, but no one used it more than once, then the figure is not that relevant anymore. We need to differentiate between the acquisition rate and the retention rate.
More than the number of initiatives: the number of comments and pieces of feedback taken into account in the newer version of the service.
The share of online services that offer the possibility to add/edit pieces of information during the process (so that the process can be agile while pending, and does not need to start again if one piece of information or one document is missing).
How is citizens' behaviour monitored and analysed? Through polls? Focus groups?
Who receives the feedback? A dedicated team? The IT team? The administration? How is the feedback handled, and by whom?
And then compare it with the share of citizens who use it for digital services. For example, if only 70% of citizens use eID for digital services, it means that eID is a good enabler for using digital services. Or perhaps people might have an eID but not use it on digital services, which might be due to a communication issue.
Does the local authority have a clear process for handling redress requests and complaints, such as a time frame to address the complaint? And is this process communicated to citizens and businesses? The redress mechanism indicators could be more granular.
The average time spent by citizens and businesses on one action and its evolution over time. For repetitive actions (such as declaring taxes): do people get used to them over time and do them faster?
The share of satisfied city hall employees (are they happy with the training, the new processes, their efficiency, the reduction in dissatisfied customers...)?
The number of e-learning courses related to ICT for employees and citizens.
I'd also add: the number (or percentage) of public services integrated with standardised service modules provided by higher institutional bodies.
Do the services implement the once-only principle? To what extent?
Does the local authority periodically run usability tests involving panels of citizens?
Does the local authority provide an SLA / maximum time for service delivery?
Besides financial savings, I would also measure (increased) efficiency and quality.
Good indicator; I would advise two others as well: 1. The percentage of professionals with UX experience relative to the total population (a huge city with one UX specialist vs a small village with one UX specialist). 2. Try to determine the quality of the people working on/with UX and the existence of (what number / percentage of) processes in which UX is mentioned; in the latter it of course matters whether or not the UX work is actually carried out during the process.
I don't think the number of ICT specialists (they can have many different orientations) is an indicator or an enabler.
Difficult to measure, but interesting. I would rather also see the number of citizens involved in the development of public services.
Training in how to use Word, in systems engineering, in data analytics, in...? What training in ICT? I do not see how this is an indicator or an (un-)enabler.
* "Standardised" processes in which involving the users is mentioned. * "Standardised" processes in which UX testing is mentioned. * The number of UX specialists. * The number of citizens / the network that can and wants to be reached in the design and development of public services.
I would like to ask this question in the first place (right at the top of the dashboard).
Besides creating awareness of the existence of digital public services (besides physical ones), helping citizens to use these services is important. However, is that training, or just providing a manual? If training is required, maybe the digital service is not designed and developed properly.
The threshold for using newly developed digital services must be as low as possible (maximum accessibility). So, maximum communication (and a long campaign) and maximum explanation (of why digital is used, as well as an easy-to-understand manual, or preferably an intuitive service). Then incentives are not so much needed; it would be the right and easy choice.
Difficult to reason why this should or should not be on the dashboard. Being able to work together with your stakeholders and shareholders requires good data management (including APIs). If the API is needed to offer the public services, then this is a good indicator for the dashboard. Something like that, I think.
Depends: calls by whom? Private companies, or citizens?
ID management (eID), in my opinion, is one of the keys to offering trustworthy and safe public services.
Concerning the APIs, it is relevant to know whether there is a portal/desk where stakeholders can ask questions about the metadata of the APIs, ask about new datasets/APIs, etc.
A fine indicator to start with (it is not stand-alone).
I would add information on the demographics of these citizens (representation): which part of the city, age, education, profession, etc.
In addition to the number, there could be an extra question on whether people with such expertise are involved in developing new services.
This is totally dependent on the structure of the local government, how many sub-institutions it has, etc.
Agreed with the previous commenters. Training citizens on digital public service use seems like a very resource-intensive thing to do. Shouldn't the services be intuitive to use, so they don't require extra training? A better indicator might be whether citizens are aware of where and how to access these services.
As some local governments are quite big, this can vary between departments.
There should be a percentage or number of services where it has been implemented.
Have the principles of a user-centred approach been used in digital services? Is there a procedure stating that this also means asking and testing with real users and adjusting based on the results? Or does it mean that the service provider says "I know what consumers want", which is more common but less effective?
What is really important is whether something is done with this feedback.
Are there any options or differences that can be discussed and asked about beyond the laconic automatic choices on digital channels? In other words, if the choices given in the digital service do not cover my need, can I reach a human who can discuss this difference with me? For example, if the local catchment school is arts-oriented but the child is science-oriented, there could be at least one more choice, but the digital solution seems to limit the possibilities and create a ready-made reality to be taken as given. A real example from life itself.
Do digital services have a unified creation process? In other words, does the local government have a clear procedure whereby a digital service cannot be developed until the "business side" is in place and well thought out?
What is meant by that?
Does the digital service have a goal/value, a responsible person, and metrics that are a prerequisite for the (continuous) development of a well-thought-out service?
Add another indicator here, for example: "Is the feedback analysed and acted upon on a continuous basis?"
Specifically addressing the digital divide.
There is a lack of in-depth categorisation of users (especially by age and level of education) for over-time analysis.
The number of city apps and their topics: what are cities offering their citizens?
Separate between platforms, apps, participation tools, chatbots, etc., and especially the plain bureaucratic e-governance procedures.
It also depends a lot on the city's service creation strategy. An IT department that completely outsources the creation of services is not the same as one that creates them in-house.
In addition to this indicator, it is important to know the number of professionals who work in business units, know about this, and do not work in the IT department. Maybe they are "IT ambassadors".
It's important to distinguish between having IT skills for development, operations, etc. and having IT skills to manage technology. In this sense, it is important to focus on the second.
The number of bodies / departments / spaces available for participation.
I consider this indicator very important, because service standards make it easier to develop ecosystems of stakeholders around the city. For example, in Madrid, the MiNT protocol handles a service standard that is used by all the companies that provide maintenance services for the city.
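Several of the comments above suggest using SUS or NPS scores for the satisfaction indicator. As an illustrative sketch only (the consultation does not prescribe a calculation method), the snippet below shows the standard Net Promoter Score computation on a hypothetical set of 0-10 survey answers; the sample data is invented.

```python
# Illustrative sketch only: standard NPS calculation, as suggested in the comments
# for the satisfaction indicator. The survey answers below are hypothetical.

from typing import Iterable

def nps(scores: Iterable[int]) -> float:
    """NPS = % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    scores = list(scores)
    if not scores:
        return 0.0
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

# Example: answers to "How likely are you to recommend this online service?"
sample_answers = [10, 9, 9, 8, 7, 7, 6, 5, 10, 3]
print(nps(sample_answers))  # 4 promoters, 3 detractors, 10 answers -> 10.0
```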